[Binary content: POSIX tar archive containing gzip-compressed data; not recoverable as text.
Recoverable file listing from the tar headers (owner core:core):
  var/home/core/zuul-output/                      (directory)
  var/home/core/zuul-output/logs/                 (directory)
  var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log)]
¢IE9~\8?ѦHfC;0haY*Bjut_҇y Շٚ{rV8ŻGWl|Cak~/ƒ d1B ikM1yYo,g|6wƙl1nsB߬ҝC%H;^}Yenܜ$n֚)UdӑSMWY}_ŏ%<:],Z4b4m*kom l sZd= WoX(1( C&}!һZ>N5ۈzIm O 'kۇn=.i;)i%A=vCIGy~jnJO3qةXM+nLI ?|nqo5'+Nî[?xOŗbvIj # jGm-*wјxk"t.&8/ǫqo7ha6S%oRdwA³>qβ5n9&STv!cp1VfRc J5١%JƮLP찺 +ٗBwVB/b+6W5!bcr]ū;L||j&ZlB/Fgcp9^$ n (TIlx,^:2Fdh#9:er7]9q9/E&4yQtu#(t뀜'#2~w J rs+l,9٢1!+g#\`r 5!`i$)|v%)n)H-PC)K2I[ ^!_ AO?kH C 뻌:s!V$OΓDՊ' qxO<#EIgIvdUVr3FKʥ"6'$Aq8N+/3eu%L=]tVpσKM[ctuͱ+D!"q𢨐-HW M%&"æ,}ĺvnm eVC\?߫GbnҍF_?櫦n_I츾hAa׫_.^ulV^q{jǓcs'_+%6\1"Çv '0$GKh TKjc*f7/Fws8}yZw |=ΒfPSG8/FGl_3DЎ}HC^.P{ rǂe CݗE>w ys}O;Q阼 7'E"L.Eh(*!8.VΡU% c?C{ncwn Y֕+_2uv-V],>N}\*ot鼶꺂B}by&c+ @ݟtN޽Ot%$ۃ;7D\(N{̡,,d;}JiTxz72 ,EErUA  M95|,E^ӏl>gD9*C mYSrr/;sQo؊~_Dlr{s}ѵڶ*ۻݯ懮Oo#]&iܩLbBfvԭ}Y3gg=]iȍ V؍>"8}MdПXעET{l[ϪNn>hMnbhyefy\_ooqu{CVwy Vw+[fH̿eE+ON4kacw];A/+2hC]DY܆egZ&qYsF)٤~Vr%Rkթ~V*R?OђjuoUή5ĵ;--D?Wo__KqDaKpL]C 2]$?G?t%)G 1C%k(\. 2z.$JV6@=z#1QE{V-F oCփvF`>X>:c5W2)e)j$YKs%.["D4?`ؑ5'/,-͇L]HU1p ; k*nXJ1(&eXLIB,{,ق@(D!&w;"#W:p[]| NsςvYz;~ \KΏ|ٞ+&N;18x*KY ?F." MÈ:&>O5.&I;$"qsBy (2&KhryB(&zuYY|,5ybqwpI\d ͢=.r?f ڒR ׊/#j 2Q̋9aB/?b];^vsM58k~qDŽ4n0?__LghtBU(ξOU/(c` Y-stEgJ̅G}耵}}A+Xj`1@NxTR,1BTt|la {As(87ApXY*B !\ú1;+*ozAXb6 m@1v;(z,93KPߴ>o[O[B챒-SM.G^/{Ȳ@pF1a`gX,gi$X ׇ,#eY4[d`Mfrp |yT 2d ( -(!e_9u )B:#=Qe,#G w*,YlNi??U?VvKpuU>gj{ȑ_ew6_ ̇l&3.w jk,KIv9bZeԖ[0@آ"YE>EV=\yL,c05%a4UTsc%bD<8aNxQ =^:>Ц[Of.̞\- {O}q >M\ۧIK~Ro&%?[[Soz$lT;Ξ'[-2'Ԯ|xyE6J!^ؠLaL )&&Wp7$ ',v܈A;AFs# 4&+rIm<ٳ8zAz%/jn"d*yjV>/(IHg];2—</WݱSoݢ׸9fgn?R9:I6 hTuzE B @ l1E:}ED9ky(gf9̅ # r2/3Qbp Ω+Ǻ9cB)7j<۴+Ȱ, Ƹ#aREbD hٌa.Я&Nt V3M ZW1sOS#HWd `F& g+0R"Ġ\@!.ч ŶoPЗIVr`db[ V(YqxÒ()"kARD(NF͋NQ2K)%6!`BL1jPnQk>LFR5cܙ8-cz~@.NPO$}WZVx?nb&ְe F Fs:xj9d\3xi 4"cbdpB1Y ''D*7/)9IEdy03,* Vc  BNH&6!STӘHWGಱ.k)g8q8 Nш;BYKt[ ,`aER rL`VRem!̹.jрhIR⹈Ep/! 
[Cچ +@5RtS` dHyL}YW`֢P&$:.* ;<4( Kiŧmd|=#<; wkS."erھZ8;;^]?%(@5U t:LN\Rd]0Gx޵qD 2|*mOҽx:bˣVX~fz1ŻE`Bα seq?Ɩ]8$ Gßoe;P{V.+[Y[jj47hc3և\`(]|4=o՟ƍպM6UP:I0hR7^Z6N R.T*@'>V=׿u,G߿o?_~ϘŽip)4 Eϲבf|P^@EᾬL}{ -CbM<- A1;I$hIhdT0yIFsf14>mW4E#K@e a/ OH-r^Q”I |ɥ$Lo43ؒ4| pZ":V%zGLF 1eX򝓴zsr!G?;;΄Mv|;BU~̿\0s]=k/9vR0|%gS4a真=\ɑv3ɽD #zT˳EXxٍn?lSh(=8M\_`tNz߆A.94|^G_ V}ՙ9Ϛe+B-xDmȎ^F$HJ'P(,)d~.jwP@-+".zr jgIBPˊv;k;]L~5;Pߣ1-?=PA`'q%>$>~R!~ݓw]w\TwO~?⾾".Mm@ QH|TDb9R Q8{ebLu^RY0,^;DY馀p$kI"Ҿ졖=0H;qnUU#L/2==/WWx6d3ʽQ8P•'E1s/сAaN5 /I"pckS vh5Z`(3*h2:ڀmT ԀQELܙ8-x%zv<)=aW[jzٯa{:cֹgY.E7)!ildIaEEH8q!9:I'rQYQR~w!c1`Jaas&$gqp9Ae-_fȟbc`^_xG2 )v%!8"OzH%=Q͘9!C՜ҐSrJCNi) 9!Cg@*j½R:FÒk@UQ-l,ᅡJMG㓏R:T(ݺBp4І"xgPPJ@q&WuI8Ir:lrD ӾfʫʚPL,LBc$NjK~uzϚAZ9?oy<]8 >10 .f`rrb#VU,V\,KRJ@oA(BqXt_nZ9M~i֟0WϕOYi2~K{9z溷g5ioN,0fCsR\!˘C@IQ7[ibr;4em̵,Kߛٲ~(aһNplL)zTx9*R(zBwDԉbb*Ϣ`,`3zlI}Y`1=%f[K6ufw{$|GtaWNO=7Hg=?4.v{:}OWb:*$ZTyU$څ1mx :2b* |ӏ׫l澓b۱- OmmZ*`O{OՖ]A+[*Ϫmh&t˺aa5d=ms5 $j.9fkSE7M`vWa6)#̶[#6@\JZ-te!0ᰊ'~yi>zy(˵,쾝- =t`^<(N,tx:L+Hpx 0h㓸DKb|COORrxF(PP_Hئg%?gg|+at4*R!!/ {3cx4~WCPEmloA_TQ[,S`t\(seN 92'Pʜ@(seN 9ϒ#Nɜ@م̜@(seN 92'PFABqB+`>{Z4:][XdO# 7Sᷘ:NNTJ*OeQy_i}{$lͣTEy |PZP&wRmr#:9]ie<Q~DI`q y(PUlUTU雕,B*uIE>]Uh.sC1!nPFl[Yi.ZT@+\^08"fMLWcIj~3 %G[Iwժ{~p:K稄70Dc V^pr)r`̷zR9A4ɜRh^T/hܠ~ĶR٣_/͠E I6F25a@j{rn K|u6VA6 Tu'חi%kڿW=fO"ǃn^,en`"{&CDX{))`JnXb΂wq`8 @ s|)fv8]JiA7Yk:5m ,lc%Ζ2afܟa.yBtφ⅍`(#]~15tݫm'mzҖ5l֑hbK6=+ `WCZ喎J]rEV~$_Nv,85>75?,q{Z+!@;ihPG7 Pt-ʠo0՘Tj2KHܵLyxt 6$D$ml=4Oo?Om RSwmmHdR436AHòN[yeqKLUy^\zf J/44=UŧӎIw6:sw|B1Bцʭј)lAY+ BIEa0}*rHUqv9y2U#nQx0[ڛتJ.Rk>i2b 0>*-mcVBTs6JzEg'z\l}@Q[ `ic-F ߎI 2{WAb=ڒ]{ٱm/]~Tf݋{א{_~4WguPyP.ke]ٝ9\A,/~#;σ^6F&VG7BA#λ\I٨k9`+ RDlUs;.Ȁŋ5,6N{KG\|eG7l鲇72U+p\v*ѥj#z A6zޣn):#x+o5>^%/W6wde}8~;c^)&[Bo;9m\]ۏuxܩS7<\,/-ryi݉L*p 6`1|qG1rFfZgy2mhHH:lCYTj%k=ũ۝@}fޫHOu8)n[L:YT0J5XO\7CaC 7uVzp_&}G LK%rGMr]# e%,LnZ&ٻ;Psv>?`8|IZQY*MjwagAX>A sroXǯJy=U}mRafr¹w)>5xpsuTxsz[&ԑ$!L):dYJjM5Q.VUv~7||ctsH]棽oת#Gq[=VS9XoU L%*x,)"ت 8YA-PK)󈥂$ZΘхRc2{Z \:,U1RC[٣*̓,ujB7pMQF[p#Q⺥r^g?~|e_^tzyz6_|{c 
Պ5bd4>d%XBOLaFXf*'lyh'N4ªrЊ;$4@[Y3p0V6@TWT*|9blwZ)/4T{vt>h(ڀJhsss"nl ?vlH6"9 $heDՒ^\}\r#+bL}h`+'쓘<DL '*+cK4C$6*Bt@5q',b&,C䍫kE ,U^9v;k#u[gqmFf,;k(`B"}ّ*%1{@v9 &!rWyeq',k(ZϾ! UM*%%U|Q_FqdGĎsBgG#}3dz"{dz֏dzF\OsM}q>Yldyl0u1u- \=,lA>Hb3lhsZM  lBmY3䝃lӠ,8 ZQR#ѱZZ:鈈1uR3+eoy8-/)⏷?⦅Z/GrqpXã?vK'd3U.z_; t\[ج(&IVY$߳R1R*[xY3fi7;gow qRڶ `6 ê]Ց'+i MU (dSƅY g!aRNH/l\ %bfɃgFϏ"s?̧Nv<ۍexllšPTB*XP c&U)m(pZb<>ޱƠmT+cm@ Ԗl(a׋LN G!ܷW(QoQON'__xS5mqSo-O6 .U78m'64+$ygf ߙz٣@~I>U+DeO8-5oFqڤn_i}R>}|_b/4vWLVZ1iJq"nu]* /]ծ-_d/_lѷWfvI˪wK^L>QoD+4ӓŧ[Oxb:JnށNЮkY'> WШ66'ƴz>Tw߯W/M96\g:=q9ov+H4]zg,귙)~!Nwɺt6fmvf[Y.-<`Ŷ_Φ_=EF=| wuX=u3ygz_12nY|lIAxkk-;-],uDYM>x𰪾"?VMuXm_q(]\b6^Wz?}ݩk:54'~w6?K"soo~~{Oߒ] d*[K+[?qkukd6ԓ_i_wTY.`J/L^ )fs/+$6YiVUUb4x_r}ބlČfvd5>7f}քDI0X=D;Zπ}A’8{#f wsS Ff D@׃bYLr zaj]^}8On2 LyAS{GdQpcR "H д'+MSeSg>ܜJZjsbǴ5>[;'ZXK ]LKpU/U/-KjN~Js?/\?Ï9_)s=wTܦ7s/ߐP+ 42`EM=a P)b-St9!alGXToO7f/#(3?^Ÿ.=c 5u_o {ۍ/#;y((t!d^t$%DB#%CύW!35\G"`ҥ@@(-!9]F艂~Qk.|q wܳuV㜭{sOlx:S(}ڈύ܈ύ܈ύ܈ύ܈ύ܈ύ܈55sSF|nF|nF|nF|nF|nF|n$5s#>7s#>7s#>?8re =}1p62Jۛ\,dsV. eߧ>SSX> K=v{.3~2͝ bo}(tOixqG7D?mxmwW?~7 ? 
W"EV 9U,l6Em( 5FTZJ:k_;spsW&ۨ+j/+(O3B;^t,Oo5 fI f|5F˘ƄlCE "%F'p #+,Rcd ʩޑ>x7ys~,ˮ9|jd??RNA#Aagtx".rxH+ݩ-Rjû\ qû1\0w㧽G$5nދ6 ;K&d6$KO[V4<բ,e9y Ԣ Bą}e\hdX±l BV h^FcU:%2N*a2$[ L`VgKbЭБ噔̤@D؎mǟ[6B{b.m&>#OߏLSRGg,[:.3MX\0&ZbϹq=xt&9΍s1IOgrBn "`$RWg/Y@o*쳳կVv jELѯ+Bؑ VYnHoMB&lVTuxCKhDw2?f.pЄ9aɰN@BD/ Hw>*TBjK,ڂaNq*vjlr >x}û)WqSCX_ 1AlC}H}OLyqШRIuF۲E`w>۲E\aO}@~t|ΝB;/B@5>щ%4lhdg+PqUE 9Ф CβA[dtw)E1d U"\1{t%ߞ\M-ƶ9 gBV@:?mʹ؃-[t8GL}% ]O]_}Ҩg[ VQ ;"1oCfRZ_Zw뻧6t:k!1;CjQn2[v~}͝O=ϵ\?Vy9=eVgnNi O&ljNt۬2˻66O̡n}l;uOS7\>a:4I;wʔ"%F~i ]}p9_{֛p{zGs)T2 C ]['HQ,W W۷_,8{5b) g~{w/>Q1UFzش?|wwOof]vݿN_6i0>ʽy7I+{MlWuUalYӛ7'7ܪ,MQGS wZ)4F -zU$m#dQA;W/qD8,\nWD[[Zbjkjq ?v%S-䣦=nD3b6*VQB=BfM`k!'#aă!K 5\coYSЎj3dWEX^NpLMɌ'K(=()!MD6Y81RZ&ٔ)R3IC['H9W%}9ihVܸݤ+Wby9 3Jw ;-àtžrMc3vE0D#\hQj%As.3xtZQgL6 dyUœg2lA]2>H B[12 \e):vV9:U%ZlmkrJ/\S"&?%zF =3 ibIIԎ$MAh$4>i"Ulf(]&tBքٯo343!1|)"s l9.1O)b֑r;QNKʝ04otJ:&<:#ʐ ŭޡ!:&C6` 8c^iiCӱұʱk/ۦ}UhlUY jGڽpBˁK_W+,Lw tYKLتT-Y\NO~v'_\B4dNdw1#LX`v22Nxp*[*N&, K^4a13 oӎĻ?teWfXbRQ`tf\rI#\S*;AoAw)]+yzߦA }wqisի_ן+HE/Nv=.wlwWW8_sl9Pkgې֑RZ39|siA:Ht@|9ɨCzU4(uR9wm Q@#WSk,.e Y_O<9A RP:wQD:3*1nṯuBïg3 {xwWnH<bϻJ4:Ht "D.Mu}zOq -/)7ӝ .I2E.eM"_vrDGz³]a]i7\fodj=]M;-׷e UZ?,n=Tsaydtw)Ewmmyٳ8]- 97Y  e f"$e[?wYCP"1; kzt1Ef1:9aw٣7n wI2l?!.|4xM\aDHs48 @MI0Suí NlBRgxnH(#2Hń"'Bj :zkSZ#gzhu˾!㩊zt|w379,Z}ONkvc2v=R p`|ϯƠ?rG!,r(/\<3)]@$fiδe;ȞڕڸMр1fYq%'IT)@mcv^D$YMpCRǵLJm9{؊R^~j[H][Y  guCFb}hur7 yg7T8>WPL):"$y,Ǧ"iT'cmʘΘ!.eJ"H>*T\& B I(+rJV2$N)i'-ae6oC4]*jfiGyLjGeY`Y#gKVzd%eJVZy *b W-d):ΝT eYQrR:PrO o4(2^=f~X8;9>]KfL2a@*}B`T(䮙G^ PḵW(TtJIaW"n=օJF8:)+RLW>V(pst5_v!# dI. 
i 4"~d0XYCg{ar +CRg׭*_Pw%p{>+b?^w~!%ۭqcg̽1g̵pr^Һg@TizIyH@Фy r*W& xsGz(ZGQ %OB-Zi%C;:[Oysʷݯ5Rj ai5/ˊQJ@{K k,& 64W ۂ 'UqH<7q]b|t \ \-թ ZD%7NFw.^ xA/]t Y|w.^ xA/]tb>bײ݅y$ 6}Fa]R U BbfyH(&qCGwP(&\(6i 78dף>:|l5cBɠe3)t(؏[*O2י; 5dGɤǓRrL TNI@9'LLB;K[e9{^AdJKV]\]y#ߛuRyTޟ76ԩ B5K| "024QŬ )STIL kN_Tap;))ԅtY<+KyIXeʂBernI" $C$IUfz}< E> ,\7NX7uƨ'R<;#AHU7 cbdDں,رn:niA"au@5}D'3RM,${X'H cs )%ˁ{PS_y@h(E.c\y*]|JȼkX`*z:[WCE4kgo*M}xaWF bu^%#ɫԛ]uggI.l,{?3/dm!Y|_E8!P&&xRݰH/dl2%x}_[ %Trpc`,yLYHaKr[?Ji!sv0CZ*nV"kai=o|z}JP; Jv>L^wє?4{&׌/m}Ke)$>~Qm%% V6틭&Rt2 OHRXUfol$B"nb hѬVXvImXh?pM}M)2/Wxh@7F ,B! M>YmwAV:49g#۟h&=%sYKbGҽ. L.ef Ns!X9\5T;j;D%g; w+5s)yLlr5-\^Z5/&H3WƣHIQZW8B3a#׏nNۃ}|?Vmyf]QF)T5 8X/Ψl!OI }\ss_ްY~4o[at;Sči4Mb|+(<6M\9}ë,?}sW~}F&"~_⒡,ًq*|m8u+,ߗ|[aVPv25j 7aOhf!3ܳEl60RveAts nu\W [=&W4v >cs[+5HZ4釽W8)>oESN*PqC}A2_yb&0\5bO=K'k8?t0Ҙ-0rȞhAģCR::it[9_ QtV?2>x5!μxzԄZSßٿP}7n9D d*j>Uշ6-Yj.iӮ {[\6JabdJ^]lmHV]}>]WX)e$9y2Ԣ Bą}%#4l,hXrgnb%!(a*eDŽ⥓J̅3I !2 fZ6h[#gPV+NG͸aN]?ڬ7b==-ӏד>ZGwWtwjvi9.ج﷯f= Yѝ՜[v~6݃ύ=F^s;?4WsVϼޯ2 q~cy*^R{ɚ_ܺoU{{3|7WsdAδCfu6]rO\1! 3֝G=e ïM?b8j2Gkb|NkQ_eҭw"#U&YdXUJ Ad |w=1OdVd+މJGdy2sޕ4#鿢@""їMm! Ӵ[gGEj" ID2LJ2U[m7^r wp1cZ7/rCj~[WdP8d 0؊q TQ= %UF!記kRgq˅;>!G\uޣcgJb5:Ov8Dx27/~V78}Yz-},٠6|ml$w8 &nYd|hQqfO^l@ahdc>f%@d#7(rV {My,)EKJw]vMIY2-i*foOͬ9S)Rj kSP6b.:i( zDn7䮩9q8;%"I`8QEJx5S]$2j5q  J qZk5Lw*[rT'X@q8붜-q6կfM5RK@ % )m41pu cj&p"$@^+([ߋ=!^! E>5FG9,+*Pdl{(@{uf^juZz2It #9RvňE1TF9W1YWzJPTc6bpGGqWA/1~$GS x3]V*R㞜jIdkԆb&)pٹj`4_DD -MXMDv-rƤה9,\M*C Z _M#<㖌6|?tt]ѡ;C Y3'!۪?7ِ/7'-Kn0&7N4=Q8;@ ZJ Z7+-t&%>8Ɣst:.RRa!] Ne:{ ak191`F}WT wC@CVUXr:팊,:#ltgm9yj|mN2L?*eoG ]+Z}@W8D$Ҷp9z5*OO1#\Fk<5 B)8qq ]sPz-gnO)1!9R"*'U5B"P.(ARNƉ%yHdɪٔ/'°T}_KNa*؆r&]d|ǣ*/(!C2 ^7W;:U` 4Zc˜Q1 C*B>B -rAѰ#thx4b0vUR+K!m b[-&4t{8هr_: Eǁ   gJpyRLsBo^+?ܸdSFOPjd`B"zMA;tr&BֻIN'-k^aqM(~uߺ|.[N\I;RE+j\\m d5h,H"9h({+jȊ]'w \Ñ⋻nʅ?~Fc93.CLܤrR~|,ӳk9s8;{is-(}<~~1] S\+v/kF^?]ۑwҿ'}5{i7mox> 17 oO>Ʈ8LNM*NkFMslء dCTCt}T*`(AT N&TlxNT1W:gv[UW^5=&'>ez}UdA?5E&3̖byZyv5-hBպ*0!q A?)pQ h]. 
^a(jF$_s1* іD5DYWe_2jig-g:ǵ,0]jzIƎuRԨ >Ev@^sܥ=t*մ~x|U#Xdkᢌ BD6*Ap5,NXCb=yMr|$:B;0ra<'T|3ڱrAS)o *ьT.R h>ڣ߳+!0`rC,CD,A136FJurUň%$iG$SE-6dG;uoU^R!&MlSs[ڐy7:|PU]S[ǭ43wgӾ<H>4SJy*$ˢ74PBbZUJ1:RRHxF0T R2tAO1&{G&vfWEdc~p p?\lV)TڊֺdTQ"Jޝw-g3; =<п =lo 3 - ;ϿN'CK6XΧc ! Nf-WS/hsmv/xA#iPFo4DbeanH iu`Pj6.)#wCñy}I׼Weyi-;.ɓJTІb!m_vU+2TE:[9?"o|eHEp1{&?Z}g>,OS9[Q ?kI=r$[~ւX&qmRFS_Շ Kx6+ٵ($ ޵4#v`ፄ#lOez:p)qEj-IY$MB"iU-X @!_>1{64MN[zɻT*\5]p,,cAi8)3?,F.sBuzݺiZ4oaNo^LnnE>V^L7aV8ofeݥ?QN\:Fw0 qޣ9Z͋GmLu>yjU`! 17K˫ \#Qo8v?i[, Av$ #I~ax0 ZY,#45`Y"G|17ɦY:*GOmԶjarwMs)+>݅8FW FS mlTcY8} +܎w??2׻3J LR(7$ 8ow}Pkho>4PۇjkzO]u7Je?,D_KJYya\K酚irlc/οd#L3qo~Hj0YEߞo&6G3^#we&cb~M|_&,;sV+7u$m2){)!;t\pˣ[Jn`SJK- Ln5Ҟm4I)h$(Iw*Y.r~~;+ƥG2nd5x׊y<8k[Di4kLvR6ۘ h9 jzfu^@V/zֺ1Hݘ֙ lvku(M9dOi4SbU Ip*;HKXJeW ~z t[qd;v2 hƸWgW.|Hb  UBQ.=B.sI-*qɻJ%]E}Mƅ+\f2Ra{: \֨ EkIlLHs`]UWTK]i!D9yTԢ \ĸ}PvΕYPlapn!gԸ0 ]qq{4#i7^ht.׭ҥ!w?ɼL?܌U'X\AfƘ2.Iq`FnRҘyƼ1ɪ,SكwV'"\A*8p6|!' BpT9PWFIn]<;[*mfN1] a1)LU,L3/5#ː * ֆ4AA RΉI% 'qB> i3g {3{R_LLF3Ev<M! 8iB.y*VDV8 @!c"#&0|y^Q:Pt*NSO< @]s\l?ȹHFZm_}Yqp]Bɴj~!}I \u0][`c<1^=^!=#-8n FԜP)T}[ճlǣAg,55h*P^2^[DdHb2%^yK2Wɽ,v-nM`:sHerIK, TŻR:˒6ʘsfJfAZʴ6&TSZw~%dLz.~c}|S`A 9yTf˶H4hZpj_qs?O> .0MX9` RH; u1x'A6N˜:Hu.#StJFB΢n#KÍfS^2cpi(hMf(Y򥖹@FΎ1xӳGN|r>J 󷶾wpai<'bVk,>L vW%=ۙȍ窩Dn]B.^ C.=[l̿&֛thґ߼Mؒ`K2ۭwiyyx{3=ܼJ]{w!Vri\~uKdz(e3+PIs5ӟ7ݵbn y%rr{sDthW4N7?nLsOn_kU $#@%2iJ֊ -USvZ^y+0 e9d4l͇KPwPNU }q1g gyghȧX{}n%ZϭGtש4> f Ul6~X8۴kUv#6=B7  ru|I 32 b|dSk%pl4Fp$V#xf[jig9.H4䌋:SM`,Rg! 
O-H 1TCi#Ŀ9ٮ3*k"P3R*A[J+_ťARbC}y;*Bq.[,L&x:K"24 >I YbI9+=Z:έƹ2kΜ;>kY8吙[T.X5Ls[*RJfJhk7.ז2ȉ, Y"C2RC$ )'A9;YW_6;6l D\9 hE+w g!xf \ 0c,8`#eL㉮AƠʻ *Q )#NNɊbE$pGjUZa2NOZ0iCT̉:TTsac*iFK\}(pqZĞ.b:F t!2{A8F#(si.K͜RYz%:ngA)mǧ|նȞ'h]v`d"Fse9XDX<$mIF; 00} Qm2whܠ46!MEV͇/F˷5@DظŔ_I)uxyoQzrA IRp%f-%"i`Ev!)-7Cs ˏB(XjDkq}PV'Wũfd.Q <4Vx ь*p9=GE¥YeZ85 ( as3[Ə v1yx˵ΩZX^0\9oϞ,5tN} 2ȑs&w PyH1@e9Zem9;V[+甈z,n~M~"zÍ$) BI\Hn*6qچP;!Q r nJ>dz$8\u"Cj+{U1a:itJ<Ʃ(Z$'B.Hfi-bxfgVK%sⱻJZzk֩m6UlA9H@ͅ s ѫ6A$Y%iŇo{Vu]tOÑU8[*zI|h2mao~ÙBd(5_~{yh(E &EɵܜKK>B]kL\UkVv^|IYI'@5;娊dU.\'v/ҋ'Հ}Wr/:R&e/ydYVC*>1fflpV/毽usA3jF?Or#Ldx-\q_&!o[wśEX)Bx"ĺI.4YYfyHgZ&DJ J"<*dr8mulew6Clύb?6Ew?lwFZ|C{Լ ѕU:yְ$*},DcYd$ )%Q C7{V/plފEl{n`0Ic^&x%%q߷xtdK/%9'vbIS,~_.F7ǘ?ȺԿ<.qU^~'1zʁi!Ns] 9'h.KT4ʦhRh Rtukp'՞kZْHH7򯘜<$iݠ K="3r@5qPRE tQUצOajzr:S2EᤓY ]9hkB^N[=ŗENf385 fdڲ-0V-ɸm⺪es{V11KN+Q`z5״8(Hնy@{QnJҌ'B\Xg j {_w68mszɟq:r̈́D( vKz$/税9{z3)̂[RMC. &R)5ZE6b&nj͂:Y@SbWgA\ biSvҐ;!΃ךXZ;N~&놃Xzi!@DQHNFVf'!2R ĂI 8" Dr~^ XjT(*[DY"0Vp4Q<(meD%r1>`Tҩ8 HP(rJhn övw+[i{zmaQ#vYkivZ9Jm59mO(xfT8Fp!?I$-GHJ+N̂5ۊ{?I^𺵌 UҏRƛ BCV_SQk%MPd2qi+Й S"9#7d,3efŘL8tOui5 )r=C x/vsr Zl`ViydJS%gb =^mI\\!]r<.X Bǥ P<.EZBR*ޕ~=GOG.9ۦ ZPPAm8˜QFc5)Z}Yc/4` ц8רRΘF)O5^+PJ.:gVW{u3F`B& NgN6hi8Ʀ "`d\@Y7ip[iv,ù 7j.Z(`ivQza˴ϼxz41%:,\-rⵛcG#QK]} \&mF BO 5œ̈́ i6{rW J06AxɓF6 ":Ȑ#UIZ$zMPnR Һ]fu&CWx@Lڍ%n,+݃ GMq4j9֩aH'. 5j!|lWg/`mB'ֻ={y:{y:{Q8{QS=B=$(xD͍}v8 NFi ]ύ9>I2G삚 ՖNVfу-cѓH}9,=_rQy&sܺCLLHKgEfGɤ*g F"w)q!X iqN:&hYTI}5qf;=b89.֖]^u@˛N!3~H>kxcY{I+R%9K5MEDh9m,ʳ!x\8LAkWB7Jb# BRAӶ1I A* H+"9ZIK(+V ["iv&~[K!/K]"KyZXAHePJlg12nD6%d&v/-:m! Bpِ-HCD" MR:ː@pP0- )!)%ˁ{F9syt)9e^K+pϊϻx5]H}%{K~9x4_S.Z~o 7$^5Xt)~$ ĥ?/OqL}"U?*N,u 71Wsv Gn4Jg}4<;J}L6%`=w]*CVfsq5i4 KT4( 8b6t1ik7t;wi>~[ szx|vԐ[ix}[#8Lx[H^>qR]|Qi[{usrwl0œo?|2[iovI _Dw~TkHWڑl0b0FfQYӢi0l`ŲDn<=;wsUGQݫnuT]lË(W1?`߃R{TN1r/:ѧ~0׻O}K;_h)Dl,P>C;CG mj M͆&yiwZw=Pc{[Ry[_R~~ t: tf{`\Ωov/b98Mlr>s::IRf6~{\n_8 1c9z|Оweu>1cʎ.{2x\V1?C&Iʕ6٣#-s&"nb hPh׃b٢K$af]v|GÞ7ҙJȼ ^#& Ʊ(RE"B!) 
|B1eC*:4<ߗtEkO؞sAbC孛.+pUnpK.[R لfu|i31\#' 9iH>C#3b " }l.g_}2sY BiLĒP85e- busV oԨw>s_~ qۛݜk=s!w9>gt2٨Z2fχI 0YQ3gIaEAj&.JU9.Ƥ`9do:YεT$ (>qM G$*s`DUg ^^:yp~`Ww_՜bҾb=厘Yufٚfrtq=)\BW )62ߐ:A! wr;r>RVA!4c[O>d\3 ^ڔrSr9N/ryUj' EN&&p|Dּ(p"h Xv˳h5qPNlbr><kc/e_(뚠d0|;`l ~ `nO@LWwW5(aV5ph3 ]J+j`mD&$ѲXcdrЁZˢG'-WRW=9;O~>;[4D .qUpŜ~NR&~l OANҠ7LmQigoo12^ rWS>+~R,#j;W6yP.lKbJ;H#HEVM?=e5O.IEɩ`u#L (q7yѰL?+g))~Ł^;2R0rȞFģAC24 |LW#JVh|_OZ_{Kx3"<š9֔|WGd=_ vg7ЇIy@eLEI=ǃ)cZSƴH r˘)QueL2B 4k{|zO_ J&-?53d!vzۮgeL LV1kƠ!dǔeTU9Y,uQHYFD)H 9٩\Ug3g#x1|%,_WД{ SAKm&v}K^.ȝˆaIHYqQr>\VhXjmHwc^-*wJ6_Izn:ǫ~-B:cRN%t&vcPͷhE)o%9YcJaGn^GDJEp|Wl_[(*_Xf! -ɔ51aݛ5rOgsWگl,-C/V݇cpJ,FѦP_߿J6]GӟX rYS|uW_Wos<25[W_ͳWv|e݇4Fx}F^r>j[A]t=m?eE1Q_s WzCO[\]{?5g7U^:cW<7?o@۾ o#Ygt/"$R]&VtZQpV=u<9O%'3Nw͓GWWǏ >hыC eg2HqHa4-z%=90'qv.RDc%6}eC79d^dyfZ~ϲPnUU~~ҳ)"mڏ [6fI`9ϩkϼ_S)xF:ܚ( -bvzn ^'sk2@z@\?2P2(6L< ujԥtPZR[˰]B6Zm[ظE71;%+` |t(QD&GI&˦^pEi 9&Ky Rه 8ٌjtd?{WzB5\Yx},,*exM-~e䕏||ņgz@uQhITsHaVlE,$LY+m!Cmu*})r7d֜99XP+NanPMZE׊T (D` @q夒 `E΀rrBx( ڐ 㬙8;?̖Jv@@ٗUh0(e)N+{"6ho4S_x,[t:ESe4Yx/I!l^YV{ s/=Fj5Zqv5It 9PdudA˜hPXg`$/hc8N+֠Nk"@)v2U)DI=)QHhB(IGSȘQmvDX/6~S_3|Jb? *%KLKv0Cb!M>]ٲ_2qKn+L:iȾ耷 eZHW H%]mV?O/% Hbx4̓Mv߯yQHJ,=SXCR6hqBRV)ȓkb鄤%˛I$1钥'eDR9酅2[$e 0WzA2oS(>v0=t^9ؼ0+B"OgHPNfl˔$W$c.2~L˖G*KdJX8䃉JFO**yTHm}:2(s@(S*: ;YY'g6bZ/7]?օM)EAPH>GF? epEYm /m2)3ئѢ+)LE#QKW&˨jmC,FrLV-43 l;](}-kRo޷6MOj~ttuur:-+EF;o$BKG@tuF'MB)(VFrQi%)TRj6umI\awO/\Dtb7g\nZ.=0勢ILtjYuՒ`>M1Y1SI4XG10E!33l)֤,YbLF R1IDvYn{~Sc_(9X"nrk;(-NeN0{"#Ea@*TnlŕDէ1rH[A.{vqC܁b֧o9R܉xIP1Mfro/$xaiRˠZaiSt. 
0LɩWZ[IZfBXFDEfaiwS}4Rz~7f#wKE v.ո|1-+:yiJ M)B 'rb',|$L-iKr|tʶ&B;0qa,h挒m 3gZ%9ERz7yNmY`\ɦ8 &T(~XAԺ6GK.i<{"Oݢb̵QwUy{t_h pKo{չk:~KU[ `iH_~3C̲J;(d/55JaǨ# `q-KT9FZ"Ry gd< 8sUU\UrsU%q̕ћD%_ y޳Ǯ x9n*ʹ 7?@|(gu)SյW­F!5C RQhn^,y i^'[%m&KYRuAo>/ʢ\<;jhsoSS'sه=xMvϧ*6׳B?r*ӷC%t*JA R|bq)B7.g.>R!qT˖4/ofU٘^IbD:zID$R(wQZmhL}EU_|gQM"*>6BK"99C!Zfd Y$!A v3RT\q [RD:"Cp#uBٺX+q'XE 4(AOHJTU*x20y65TReنTѲvZ>ȶ"9e-&&i|B֢ipbl[-{rFk9{ԠZS:9=ݷ50ܸx7/o':]}\+>0yMv`l{7ceh&U%CɂIjux.Pm %d Q>9Q/QM*vs4oYkRlVn?`7{ z1bDbV IT 1#jJʩd ળ<UzQ9tX"Z X`m=\^,7 ξz61&^|p LƘhrrìk j1eڙv{db,?pqړa8?Ðd}jEPZ&F^$B)B*Ԋ}`+U("48QN0ޱ:Dlɩ169趜=$xTGS2&W[Տuή鞟mauN&_٘<#O 7@wCNeE 8vE U E닱)7i}7Y~DYB Z"^ l /0MWN!y߻{fJ'3ˌ?:$=)=%'iW7;5+ϑ/@ ]eE sce|ǜN5-h5Kj%}Ր/n<ը}c2SՌ-Ln/h`ƿp~p1N_0G"z _/WOn֋/o<$ONkժN;HRѺ̪Pdԙ1j:!FH޲ j3.8q~jlvu8cWw:4hp![x`%ڠdaC#-,!zb}^T r>ђϕQxP]=W}g8o7r#ӻ[ OVqYGL>R()@% aҌ.1ɖ,!0Uc")w2tNPL8LFrEK):JUp왌1}VҒTY)ߥ/V!1kuIsQP(,&"fLPѲz붜 :_s8\rQe#Ћ4Ȟ'.yh=\ƪ.'UV]=)/m[!$ )R)QUZj!kSJFł dlT!h=in7=NPLYϺ-gO=[?@>-ڢ^AJ<3 z,51`7 '0hGKnR*^ +C~cwbڻL(RQʗBEMЙ{B,lIOW?K)bdE1TF9W2ZWdr-1p'p^/8hoӓh!n6]V*RVaON$5jZet1lD\\n="eh$秢|D@W`z}Rk);%x\Ģ`բS4L /F'`lH7NE;Iȡڡ;kN&}^?H !agbv,D ;Y$L&olzbPr؆ՄG> 4o݀xpXD MH I1');js7`LmQʠڐ:@Aki\c+5w@߷j !-CV *BTN'yQrv4>oMogiց0>.H;|[r0E$ԶpT}G+Rp!h.F4yk"R18!8 ck&moNk9{֊} ANISkJ+¯rREޯrT^F.X,# q<$kbqPڵؔߐP`X1n=XtʧصbF+'C& -|, \IX7W(7 tZR>?R =ry8D;RʸdSTdJh5Y1vޥʒrENv*, 6!SzcuRq]&iR^;n.q WUQJڥl8uWK9 ڐGݜu4HvQz{+tdu7 2W9 }hc Yl>o'~V&(3jl?VSoV zj6/˕i~I_Kd_w�KX TqL~ uhkN0b^!{ o_E̻ױ[`EGf@:)34 ko).j+`l!(wc5Z>\CbKS kt')F=E _>=<>>{wWW?F>[J?YiۿzAye}I_lv۹ڽ|m-n>`O7ۉ1Z>_e[4,.Z[u4|ɗKjXE:ؤh.P3KzW`=sK瀵Kr%K]\e1EtVl0@ꃵJ;Y볂?w8J~jeÃVmOP^ݡ޷3yG̡KZP(Vpz 4u&%LV;NAqGiTGꈿh7{Ӣ{H:j(%mFMIUul&NB#JRFTJvckh6ULjAkd&}+9Fr_wuwb+jȋ_.om$=&2rsqgb*\m;^O}}ss \&".mTnv"_ 7Նw=]7l莭T944l7x{pv7Fn-n=d^~/_1qqh]oIƹ3ޮj^O1wƝV֡1M6q7植=67缤<[lKpzPvC>0$zzeu rqV~Zd7bkGwBH 与mp]? 
ұLTh>ü#O>awɧcKFF'RM*Nk5ٰ@K%c&u V(E%YRE; SQ&ph.n٣E`Bc)&'fﳟϽGUޣG@5m4T 2%&G.P[We&#pԊ9OEc9U 󈹂M%aBrh2j9r#2v[%i K2sk&[xT[H{OV//E|/ <]]޹VVRl 3z}lL<,FѵFаNm+Zy q6 b4`os0CdTJ;[nmU bIǡVs '=!]^XZMJ Ϫj|th=)G\u!m5ed_7{X@$+ Y &)(6J>* .{鶜pANa<M>E4E,ֶU;5eBU`7>ǘy lD#HQL%иP+Gj_M 9XS9&ŗE+0icp6 r(9KO!zF:ldKNZW>;Rt>315ʽM( +Tnnۋ;\.;!aYA Bq,)FH1rGZ~zzJ n#_/$ۅk"d-`&XWm1U)5jCް5F1I!sU-,~3PڧAÑ+YcY ls!WW[G2a`=P el&=J&Q8ek,YHX9Rb=yKr|tʾLf(Qc4ںQ MJs_j]Ĉ [gѕLfj7jȲ(]sǕW%qe`?(*VYFh$o3$ F!',v܈A;AFs# 4&+rIPey:K#^I}AUvYLɱ5Z }Oef{-n*=: ub٠:T%uTe~#YPEEl &)u"Yc>Wّ*̅ # rR ^!c57ĎnQ٠X6z4נ|]vr#LF1ea2/W^!`2#ƂY#wVzL N2zȾ4$0/J7Uj TxGDJNcRY=( J3lXG$ՎbrNJp8h9F=n0k)tW8 Nш;BYU̥Ԋ 6aER =? Ͼ]OO0ɝ PH*h))%QHqB=#ex P$(:lT)1;9yL}Y]LZ|2}m6zsx9Zƭ{fmUPVkwHe2uY(&HbcOp1f5NѩPw6nxuG޽wo>oͻ0Q?؁ypm~y$} ׷?chkhUkՋ =2֫㮌ʡ[)X|8krz0N+0$6ǾW`npV4#vѲ=.K_E>R+Umw}Um;M4q2_Lʆ4`# &"# 44)Whά;&{#oRV18yah ilB 2,%RK0W0eA(CD3 cI&&Lr>G2I@w${,o-:[nU[JPp "g4`pPZJL)0@-As\^R5uD bX3Z!Sy.4+촓L0 Kyϒ5SDcWDpCzXZYhXa2äl@&N'$e|R0z|S6);Onƕn^ [7i|ԡ23r;ԙ0W( T̤XD0)N+A 0[tϼYw^I b5TK&{zaT) 6PFނ-oɾI9?4gc[姭/Afyk#X`!:RjKf(LBDc7nc,탳n[?g(Ȯo]/!RJ#lSuRrp=*W2ކ"`@0..)Ϡ|.XOb۰ɕ޾l;l>/Ɠ4lnJ_JNh4=Ti%}]Fl) uWHyCD$&РEAERqXt_ R12`K.-*,ʠzP]z+ϒ^2$Up;oB!ń3cĜE NbzkݥWt뤍&hBX_O%]4۶:h=ÊJ|:O_o0Hxm g."9%u)@G4AVQAv8d4 Y?3'Er؝μ6.FױT`B<UX-zgi&yeĀ(hj4)\d~t6Ӌ%i"wI}"UpftݻW}u5ɝSME|WP*LEɍ+H3oKA#.Se ae4 t?]5XA7Vs3&t@ڋ'L`/B(kB)0ZR3J Cm5xڹc8Q1vjq0<9<ɗFj>O SN!+ãT]lyc1^r쬅 dRKJ1OTN1k9/B;?)ոlSOއ]c5nŬxuǽfN| j6'D\rȗ5Bl!e(Wb?)o_qyjϹ t%wjhė˝z3_Jz{0F \ֳطUvd%?Mʢ-u^w:{ 31,\jΫ?{«3>N|T}!: g<.ԌI8*sĻF=z}5҆L; ޮOmJg;=Xof]&tus ǚ}y~4]'AiڰV}v>-;G#*9n\K 9l+p.jzw;zKDjG*:8GH܌\y/sw[~H48] PienuA>[uCVG3m0Ѫ‹,$6"6DTZ@ɤ8KQJ}zEc9đcGLVlB0H# be}0z)#"b1h#2&";t::`OqJ='~'vݝYjx5ovN21˄8ݬlKR[73 %>B,^;8s&Ck9AF[R"JzphyAG$8÷"<0?0 1p5j}/AAJ1A3n#`2}n!EC"3hYHgz3<λkZg2ZEO/y^!2(2;@<{QX ,@ʸ0m ꤷS4=#y^7nVfy}.2=7i{Ik12A%f1e\,hb{axtL8P3&»[| m* 91J JVIk6Z2 W('[\bkO!x3 g昷=])̞ YD}2< eS*K8mTX_g,V`vzI1i!7h#aă!hM45\cwA::mw,).a 1=R7̣A%%d6y`ԌuGx[=mhBREBKHF\dȭeaBR.lYCR۝hy!)pD.Q>GŹLh 1'DA!B[ @)N ,$'w@ a9l=3*BdN_ʱ(AKQqjg܎SC$'LL!W 6YVhΩZ"dcVY\&ug!ZPg6y4>@g wQJC& 
2CyHBYC0GcyEٵO9AvIRKJ$?ͯg9b0Ip2$I1EԋJ RdY҅zBW!Զp4XTLĜ*8f87 ”/D 4ٹlvYENmխ[Tn-W cA[]1|$Ր \xi-bҤa3mURq.eldF{B*U,f9Pquˁ}}Wخ~9?N@u J `Yګljs2ӫ/v"VAnd]ֱ.=DEկZ;Vf^+OCv5ϺXQ2)o(%I=r9 M&I=YI;њ́|&S\QoxivI%yVaXݦE{|)\zBNs.` 46EL|7ah3xI2jY&ȖH寘<$InkO1 C˯AHQL(A}mݝ䱧s!l[Jy#o}*RIqe ݼ5eRjftl9kĥ"\;u+.V/ʎ"^/^y< ԣjkl5Zw\6;M&o(}xIiO`1&%:%(H PfڣgK7~}i(ztt:]܏` ~5${sٙ>kߟ~Zy2݇JB94!h( KS mYc3x|&X ;nXE*PD'רDwNS֢<f!fto+Enj:^o?kLz6~[{LT_XEGQIUrVč|4'l4@zyTMSO) Q0<=  "'2%) n۟~~?O]BzK:k鬮.ŏ\AHi}|;53cޥ&"ߦI)Z:~IGkA׳cϽ /K52fvf@i?Tٔ~rgzJ :0WC_ڭֿ7WxLvw8/N__[2/w6JgnTlD oaљ(sD-Ww|׸;x)R5^c AR3nQdk2_Mӛ4XLajXШYY6)P UOۯϫg}h ՟v%_5]T8j*Nf~4l g S*̌F6Zb(!Q~CJ։#fPI,VC#ӵ4՘ #|0tg::cΚ#tzEAjMN.1! i 1[  Xa\)* AGo߬I9;8}cnǬڈ܅M,EZdcCIT2 XKT|T%Q%\=Mwr "ԩF \P9C  q]B4dz(IPo[#nu𱇘1(DBkQ; ]|# O0Z+f|H諻÷D@h0z7EJI 14z&Zp{*Ac7{̨Lfyl&_"@ KjHMl\S-8 %ݤ&˱ 0yJT;T\Yeǎ Rώ-UZ!90)+dAJkfi & xix<œ%KJP,q#Jt/XUM-JfLb||Cb=Z1-WZtd|TR-irTR.9.1"#!s-ҳU&rL ZR&$nDX٥B]Qf3I iPfT)O!iPEԷo)qwǶ6!A44Y@u]PNKlLA~C)lLQa s9jӴ 0*8![C,dr"#1YcDT:+UR9^m2q$ r, <8h(E1g.=QH>%<,p%Pj:"k{]x_V5 \HWK_|k? _lpDͿ`iIb|úʬR~T9~Dn=A J?c LɲM9.+#961K5pt )Է,9IƒwQ&0frlNnq녓iHI4h~3 ]Lr4B գ:sihbJgMOߓF`fBYEV&{`\oCU߬kRPwv_6807} #~: 5gn`si.7O*U}&q;m7zjVthpD.KRi-ś3?c5f.Ikhnj#8$Jm#I[0"nb hЌƈY%= Y%y$}ՆvcWv4T"X{GBǢZ0֐#"MɈlRf~ɦ8Mv%g\`%\5PsŠչtBF^|X>:;]${к6Nj)RŤr{,7B4,Ǧx>PD 9YXr50`T,J[BDž/D|.$cs!zb{r! )E ,JH:&"*9*ՁR=]JmmBt.:`'%&&J|UɐvnC2 ^tٽp+q6R# wh-ВjovUZzؼ8hEv/ c_:mcAzgb:^=^8!ҝ*'CBSUIwFwҝ7Hch{s6vր}KyfY~(8H~94M4y? ݙ LMں[q%hp{s}ѢFOd a(*ho 9E2=KV{6D2vv!88+[1[Ȁ>wy8wXyfg]%_r+qzOC9CkOߗ_/~\.ɷ?S~գF*h߻w jYLԚRsB5[i|:i"ݞ_L=]p|G'_3~>*~_T~R[u2檊ũ+7WUJn^  CkI\a7W[O= ?N1g.ʿO^8Y8^DZփ)K\;%R.٠HgJΊ%6'aH'1 ? 
J޳L3>Ş#C]2!Ls)ϴ@it1NOFÏ[{ 35wlc>ny~k=IݯtKΏt}AQ ^˻Ope4z7=c~9i9qc@_]mNQpk>VXuN` :U'VX$|f311ް0[nvPqgSYo/ dFr9T'АCLjb<9J-7=㵛 Mw^It4C^MdtcR^]>os !8-_dnpq=-qwn^ S{b~\=@_y9Ͷ#!-d im tPu͐NkQf/ QCL d\ecYLʬӥH0E'g{guIkZW+Y3F=,|f?5x["f}XxXC vpBx2U\%NJkԱV);o-[>{l3JD|(rw AH2.gg6̆Gl UXBeE ,dMrTN`u7&y4GL9# NS0R)&KUDJksD!6ۛ Ov.:_MMҸ>}_f$c :җgf>t]t޼qa.6NXȕ*mn귭Ս޳/.n=m=LZhMwmV'\Os!0]w>Χ=ϵ\*9fﮎ܏y5h{_Pik:]_knc֜5Ks6Y<cI~mۼdꚹ:$<Jny^Q6;2*EF< Y9iQF x"Fm td׀RhMvbɸ{l\ߜӨ`p"MӅg9KFI$um)4dLS+&.Ϻxi Hr[|99IzAkb5"8*>` >{  Q4 !n.&8q;lkp@ /__uO.W`8h Q&`4('4l9nc 0ʷ߮w~Kz@~%Z3(1: q۪Q_WsKt]h&$A^bMvpp0@֥BQ*hB׸yIe#y:R)ݚZro;!`EF 8x+% JR&$R6؈Zcd d.]:]B0Ve6cCh4/RcL #ͅy9/,Xz<lF0Gь+x/ۻ| g1~tGuShܢm6sE9)t*W)^BQ΄7Axv0vUsZP# z瀤d #V&V;ꖀRHpXYX+( P I /Vi^H^Y:SɸPŰlcTu!kA^7ֳflgc,b$Х)Iٗj b*ӆRXP):tRx*T;Nxl%EGgR:.L}I 9{압{WPԊO~ҝ6>&ȊBV 8rJPڧ=E#XJH7n8:k&<ڛd8SnwoMg] |bO%GgJi"9$FOŠ$bhMqlnPsIxGǢ|͖ȖʧE[ӳG,HF^b#."ebe *fL>llٯ*Qv4[!jl-Ab987MO EI@v2KH&J6AxJ%jN)I6.©$@\2[ZC^E2CPِ19&Mqo\SHHQd_4J P/]6 ҬI)*C4*d9u M/ 3 rkW7@`̂RIW(JisP)[o/8l!*^DdVKʭj0wc1%B'GI!\4`cn}x'umjT L7?ߗ Z{$gZzn3@fYُL(LZo eK8\i cM1{h"1[H' Lh,IQYJLFe&=QٯrCLk;8W)PL8i;P:: t/h4 Wn\)qP;h"< &-=]Ð;)Jԭ'(a( l m-6LC9A'iu)`Ů&Oq͹]M;j+ג(hV!DAAEך8n((J)[6zTVhUYi'$C&dRɐǠ 87YEȩNծ&w0EE"6r9he:I87:zD!lDh=Xj-cHII6k"s3>2Th C OK#=sE8J=d:%].67@PNR 8#me =r2)2AGWmvIxεPWa L ΏP{Q6N2O !_(RZT!fAEN&.q:X 7|tE\GWv芔4qY!ot &:>r0N%)sgb'%SO:[CMHy\ݳ9v\7p 7ޱ=tfdLaxvPDS9-,|2_'a1ȐQG.ρXth .8:u1K9a! ٗ9aɝ<㏦[Ӯ4}}~Rxc0JEJ B~`x9N *C3&1.@gyfQ0!Dc08Jwtb\r\;&G*tM2тŬ73)h lM ~iP3,t %n9HTdFKESgv4&wfn?u'8? ,}J4nM򗟬QQxkVI de*p!шC!T^5&FCTƦdƦ/;wƫ^3|(lȖ^:+291BtV, {%XHu'CF E r BYc I"_s!DY}JyװQ9Cg%D]8.QS=M%NK)i?[Í![*ޕt̻mpƑq,Yz0HDE?0QG,2yH@YrK~p>-~MďLgdOvϖfIzj9X.0KW ŕZ+0|\,Ƴ">N\Lqh.(q:8uWyNNtVFӃpov kgsZ' ]KUFJW旗#CuK糣ׁ-(f%̉2|l۵ `<z>ߓw?ސڒč-Iu{K'ᛛAue&mLIoh(U|2}]ps6Vꦾjae_e##-HiXmrʭϧi0Ƀb6NySeTN1p!p\օ8냞2_}9jA@bAUsY$1 ^5ܓġ4!銢gh]RK)fg4=6. ѲYdSrTٜ%>ݩrq<&GœƟ; J 548J+ `p4Ƚ^_7=c|<\7ۊmz=jK?yᆳ+XuSg+<^"K\R56Pw-W}. 
RBTr*f*VK5IwH)t~׆Y+3OgemvnDfcCA-MELo+GB5Z =$؟)Zg]$)M|{T=5wg`)+IΟ_̔mkSᔅ8DmPlEV$Qi 1iJ@FU,tQ3k%r 6#JR\+]f%S g' DeO8w{d&ԡ}6+}<#S{Oqy{z+|y<:yhº%&TTܕ粴1 %H#xM*wl7=]j~Qf + ~ mᘗ !'+|6S{v0u'_V9P3>iBoTqXw~|_b_$mK:s uQ+毃^-Kxb zK4NUi|TOi}* *ZY't$r \Jc]#-X$Q9l畷UAro55 uuSp>`-K/{WGJv1=>/[Zvo0U%lա%$>8aIRfGWYtH"6]vBFKJ{Dp5Ьuz=S:-Ev}uTmRNA^P^Ik=z^2HިV\.k%LZP(Vq9DhMJp9G%D=J$5FqLB"6Ϧ*~sl&NBߑ;.ŻjPaDMh5Ĝ8FAC[*g UF&`rMs`%56շahgS<૖a,a|X?- I͑Ų>5a΄lo6V잭W[_~Z{ƍďtu`!whB'n~ۮqWC~\Mtd$VO׫wo/bgӛ59o(+'Y9mwޟvx+g倝7VևVL}}،[l!eXz\CKnI\̻OZ২Fws %_~ذ4}H#%sKuBP'[nrZ&H%rzxrC^$-Ez{vgqÅėzx3բ W&"s肛I}SAW+(uɤSP`w|dҞ;(|wW狳e:r3hOJ՝3p`砣&j#^'nAe~ɇ%rjʿn޺ӗ nБ&D5rsu!FOxj&kb$Sh]fU(2LLJ$oYkc< O\{Mkv uxA+..Ζ xhBs@ &KA[: PXBp^TO践]ͨD7nuRwڛ(SV[zPu"Б/ި7TAP!$ )R)QUZj!kSJ \,XN:o< vXC-;YwTg[[Cc18ӌ"& I{C"*vf7~Ɠ0XFe{))$NJ>PLκsv+0+r5 z,j[7 LQSk0 '0hGKn@WrJPϬ@^{At6'מerEƶ/|)^YW(Ģ0V7O~ҍxFXJ#`-r0ʹѺ P&DoXQȁ;pV&B/82a3#)~|1]V*RVqON$5jZet1lD\\\#""27yЋS1n*rk|nqy}Rc)O\KιEgEhRj)^mJ|^.Sɗ_/~I\]Zk7iV_>/ڲU+s$Gm[(?//?oϿcUҹ-:4sru.ܺ~Ͽz6lqI8T}Z_$s3[\/YUƨC @NDE& U}a0dr#@DgC {҇Ƃy cۜ9;M|{rܤP "A5"N܀c[rP~x37ZwT:\n:oD+Ii^@1v^KZ*Kc(y w}k!UE%"ȿnʅMҩ(v 8@]!(L9 I|qIp'/ >]mUǷrf5H.m`<4w([U{Kɟ]Qjj (UAĄbSzr:qq;gc9d90Wb$LBMfZ-Skg HFn2}'KyT^q$ΩNXxI7=旟[xKŹ[p"-o/-U\)v6W==jĐEe-?kFak"}Z lh4`9ʡdTJ;#v7q#v婠v78;;=P{0W]^fZ_ uth=(G\u!m5au>pBx \TL^`|c.$Nu ܻl8pv]0 "vӏctFD3q 2=ך2Bu cآUJh6ɑtD(`m|WD$Jy#/΂ꃆ gb mE $8hLL~~(b=뤊9{iɱh;" \7-P-1pC 04-daY)z< \|\<Ϲv3 lAyP ~ccS$dFR3%>?~|ƚ{ U[*hpU)P4 X&UQْooraylv;qvK]ߣǙ0Fu26r%K(r5,$#%CF'A_}7ЎD\' c9F`~c9Fragc9F`c9F`O#01s#0c9F`c9MJ6ABGc9F`c9!c9F`|c9F`1 e7u KӤ x~G7ǀsؿT3ݝ_\EBqBqBɧ'Jn K[ N'e9HZhw-qWYڞl^)"9,^}QT/PbE*03D&cL8B,m [ɚٝψe/=8GCWJ#s{RxO UQ$(X㧜|1ڰy|\:&^pS,0u2n]}_뇇3Lʗe~ոчՇ;wc_(}/_}~!([8wes4?chףݴ{AMO֏i2:ƗQBhV6L_toe'Oq'_VZ獯pːG:Zigחu% ܫE&osl|ODm4U6qVaJ6i ?w%t7<[ ;|J cjR(\*tkd9(kBδrcTjO==i3ӓOE.c?+WRVUlYyTIkN!5pi ^)#e63;R3gO{,{5~E125b!2yK579r 1fWkUl^)ZI%ǔ\I7UJp()&y]Lj 3gσ>:=!Ms`kb׳[_[|~;fw\z9 qej:SOMDhU09w&zhΫَ޷ٝtko/ZNKoۺ+?4}W~r?^_oy(=,.tDžW̞F}O>z-g]sQwm~e|vJq{yDG>u>'.Szm)MNλ-Rj<5ʊ>2AMX!TֹW946FcJ7*s5el<6}ꔿ9/>Ni\# xsj|qW_Lkb^l24uҥQтrK`ZPDD)% 
GA˧WdѪ30.PL}IFWpЪq E0@rTs(P lSr>:U*qR^f`+ן6t@-˕oxУ2xG8Xz[n޷Rf]dĔ*g<׷wz6ByhW=PF Sl!ړUa-E1TJrd(Gϔg teO #?WFw0q]ɺ^X# ܻol::OM0|ROR М9QOfSja?FNv"R4Z]B|7r_}/);{0c⭳pR)x6U|޶[nSTKP9l .[=җ:S"p{E>oqElnY-~L^]͎lS-33~> q юħGIWps{m9ֳe#^n蠽"Lڔ㒗,;ċ}g,|DG#י_/'\D_>L(\?9z\=bЙ7kLn[kԳv+\wƂ95T, 爗E̦wʎՏj%S!c&ndsV3owʩCt2FdsgmJ$f3l5v!MF1[wZAEs5:c>%K9L3B)z %BbFŨi|j`z(&ЁZY ALjgKRΔ+' 5;-9:̜~@EiNaQmEYem{y7u^t(*WMX85e!1Q5¶؂ՄJ4ؖxJi)^ A/F"쳷t{qtIY 5qaX83aXxA3 >dΌ7,3YmE)^^^ݽsVtS(K7#@+6lUfq7MctC{heRPM#.;ӡm|hԉ\FaFt}UISAaq(j`Ԧ3jTH5M"Pi-+Ezի0XC.Aކl:'̢&0<8~yGKxMu;|xieG9goN(_I ~\OɰDrƪ4ELiA'm3i7R-YJ 9>O6G7'Adڳy cw:@o)͜7j}MƟϛM- j'xۏ?O]s!O~k`2)D'feT}eo3|Yg 0.=h!娧X׭?\\nЦ:RTYe,Ɠd}Y7#:zS1++DYq*>PEj&U?B)sRMJ14(sW+@:?2hWUyUi_^0w=>~ciYҔF6LHnTo?cqr<_$ż3y"}Zf^ *:m3,TJnm" %F2Y#Q͟#z]o>n4?՜ Y~(iVlEx3}wFDoŚg7D@8klv/k?,CW\u{K9fွnyB}V0. 1O6 4ptv-r׼u/s^<ȅٹC:7%c@8EKE'XJF(m#Ih_yגEM;6gO3L|T}\]\+s9n4Jj2\}KvS)Dktq}RT0٠S[я^A%SLLUBr.Q(mjbJEbt˭9U$C7EuMV<%_/ZkGj2IZ&/u|k&_j\6UDqVL$0) l%U\c{.X fƦ,}cR֭4™>bT⊓Oã ́{}jT.VR!gm8*tw(50 #GaSxUҚw-";3 _+_[w}\x! ;z]FlPK>}d坏tOuH )Z)qR u^Ob FLׂe%Xsb*ɑ ޶jȵ@)jE?rv:ƾH<́lC 5w"j8l>bcZį1G&ⅴ&k.ؑ*IHVŰdxHȜ`O*jiIŁ o4㉄y1gU\{\ xq%dR1 fr HɢZc :`hϒ1o +KI0o Le*:̗=(˽zd,#e0( cnU{52C@@CiЀ,`6݈yMm*S<}n:At:M;D|*\%;칈P j0ȱ3XdBg p.Vj!\|LEw&%89&eq+8ڳ2h5x5{()E}88KK F^>tePH/' aj"Z8$#eH nw ӰW0Kj>VZIp@gzq6p|/-TEb8`r2|]1F@TP]O(e0!` =Kg}g)_9yUW;}w=+hFJƨwִgJ{9_!`.)C|`Y`w<[ܖHH ߗERKr Yʌxe~q`<F: ^x1u@KqY=P ۬dpҫhU;BS"HU005n8&q 'acEF5%t_P4DL(2ӲB 僕EJôL燍U}4nh=ŋ ":t@2GM8ڂvΎd P?`yrx*ƝfE 8-&sJ uYI N;>lƷۇŹzOq'SaA[w56}m6F84=~.-ŨuԆHT&oPw@>P1B(k)de@;{G:&oC@EN1c ;fCd`: (N6G!cUmQ1H=kFV8-5*8t(Zi298ksl#: gh `?AjءvGycP)EF W;E!ܨ0̹+k-o' E*C A1֗gZ0]v -, fQ5*)8B*UA(U9u[zˬ E*:h,W039l@E>ON%Ta- \n F`ⰱ n?i1~_]L糲U&A˦)ԵEW@7]pD3 L[ vM;d8zo 5$Z\PUFCk `L98 hpohG3wm% xKdM:39\PnxВ.?(uP"˥P4LT@=BB*|Dz;`;ըz-2'ի [W*ϕ+E3X)oGu63 v*ĶB!D+j #E5YV5z8YX:h\cGܠ "e~8QcsJ6H0r#SH@jXf=r@FXMXK#\5%i\4jgˇ[HrZͦ~ovcgӣL\)$T}ۨ0i9-71- YL{a(bxW|ΰlCyHJۢmb֕Uư muٲϹyiC.yBE$A?_'u_  ;.o+"/?[?x n0Fpoq<գUJѶ[D|ٲW z4QA')7$o~cc|̶Al'`kŖ糣/ xxr>UO=ɡ])01ҫ<}fDfMr"w %̓y/ί||Qb+6eR ) 
DBOGZ+5K\zT/_n?=Y>.!NluPɛ!;T}!:'#ߚ}ϧge|=+x2.a1 =c։R"j28]Q. ԙ .X)n#ص1٣sW :]8=J;_[3O2 oNvn>6#{_k}+< qG2et,ЭyfɱX~XXBrln0Ht';^ t'l%\ư>9[&\8ʵujmup1鶉YN Ϫ{+rT8dq4snG=\8!u_^@\bY@?96k/p|t㔻p<\]y_}:>^:Bscqε Y"+V$ Fs̟Q:rCE: 8G3|# |Y7{wv|g+,Y{ܸ=.GSw7\(E Ůs{B&jQWPeL^?އ|[ۥ3aF(!u>d:$rEStWOTB>M .!׳/|P=Q [/!W25\g2sQۗT1.&uAT \s|)Q^Gs;]嗴9'p}.쥯wy> _ƺџ Ǝc7kX02Yn«τW c5Ykj5s-{֚Y|kgYvYخcklͻetvG 7[ͫ7D]u@0}Gd?nr:_.hM䣲P5s79z(%}[߻viQֻdf9?o2ocbgY媾'i2gGp-ey:Mi$>}c@om~7s"=e1:O2Q9xGjvNGjRwG;ۋ^E3,k,>NONz.|证:PFCQۿssxfS6U˺x(IQ8u磺p˺դݺUTb:FR"^ 1zcӳ^o}e2v**׮IGP>"H8.$_MZ.)XCHI—S.XRK ŽO9.˞p}O:.sU'=ѤmU'hE.;Sp+_,u彎0!\/* Q2k!st?AK"=|7hLSԊ_6Eg56]DtrWߺ-?t˥;3$-8֬. U!ؤkJCs"Uh3dLH΄|9_qC?{H6C؇>;'be{6e;:RRq4 6t3 ƀlʪRLeWcp3dǯ)늲eХ[f-tR9+(+=&%ƝI[hKm/p[ϵ5H>VMטY(B!D?0"mS#)%ˑp-yPK\\Xy \#mZ#m75w~KTt 8w20T,?\pcHb&:,0G0J&`mPFVA/۟?|·i~ԗ Gd&ևC@ٛeJqpm\L$_LdOw;,pa,Mߵ:\/2-M܆kYA8Jn,?ǐ0͏E8SH?m]4B>\-?hҰR swE\A!c|ykg9G}+Up{̆kͯϧG>5hR9ᷣٳٮiC诪7k4m^{%_de823)mѸOS,xt|xГkOm7ƫ tMjeW:)fБYs/<(fxnOI &f__}~~\Ͽ~/ &~zV>B[ǧqiWtť@}i7u~kzݽS9˭_R~v(m8cq%-je bI-~&u;Y&kW5_ĆץJIM_AdiG6lOV/G?[$񽭆}N"0 & #!dws0.ZF@<*-Ճϣb[@rs;ڰ.]>cs:8oLJbbAВA9ƓE$1RXoI\eo19vSPgG.߫c@C5q-v3:tY&pBATGB0;Gsz< ɫT< Vy~)ޫe di1xp 4,H2yJWy#-MsiipӳCHLv-i}mpQȍ{`tp% D?7;^KܐE7N˻wP#Aв' e٠V5Oj~ҍf_Kc(h@fiiN)^k#-! pc8_5FZmxme Xł=!fNYr:T$x=de8hUR2i[V袣=嫶D|U^;eLDf ShC"AD4GfDC~ zS%r_ yvc|r69 hHCBHEBBˍvL6qLxP|_Qu 7|^)%%xG(㮤; :P9=sZLJd{Nw5*xRR6I@KdfIr ;͹d#@ROp^ZK?E,`e9=ն@-&2"g`N-ȑŸZ(8wx/Jwx9h:oQ4F#qBLfd'4z!>i{gJg)2k)0ÀFBJ $iRBNB!*P>3LA7ٜf&"m]e)JY\X qp6dO^7d8)S՞R]Hh)-K$ \T<-YFct JK 푎ጞ!еw]?$Z jm@t@CVL=Yk/pٟ @ZIs̸<+)3 zZ0eg^:wVQdK)cF,9+6:f`3UVW p41%U*O#fxs:(Ҿy:rx\NE滠"t);c^m͠ : yI;R!\c(O*!t\,Sqʜ֪n7g< rZ?g1ݵ*aoPBWtP՜q}-g$b3-cyϏ_ݏl3rfl! 
0Y;cN ApXݘs g}x2O󶔲w/v}ulZgS̻o+:B`mHA7$I5K2,qBƅT]Et\HŏDžTA@̌gF"DA;¦gZ{_)KEfȇKd5-K^IG$edUlv:$=Zi(g7sr<} /VXjMgx6i{rŽUf-ӣ@*W3ߏƷTr^ϳY[7j:GUzD+(H) lZF"๔滣̴)b6ٮfX]*VS*Tx>Ge+|M7xs|8DA;*JT*g(Ui+^zKZ|L-㆛Dg(( 0]HXaE@ Hj4 *(7RZҰ!Jfg $U;}[v(EJ+9աUL3b6tEpT3 ڡTj3+NMFtETU{U1bPƨ@WCWF]YU*"ړ⏧tut%2+ WdCW J( :RrJdCW 6UBWW2 JsγQWW0 ]%ʾMpFdDWs ]% ryP 6gIW(Qv$A_qSaJrsF0͹;Z%, _w̮j(eYTtl'_C({V 'V7lWxy;Dt e`^ЕnAWzCa=8`W\*5tP 銚~UFtEUk Jh;]J=ҕ T+)7ubPބQ[lfCW 1ENW ecۉ·g <W\*5tPڡJ#dDWX|_; }+B+T9ѕA#}ZS[%W84xZB>Apk06H1pJM!M#`s0I=0,-<kd9u8'&,Ih~QB)o5{V`7yjלx_vhO5JM4- tuhV d`W\*ռtP6f tu>tř 3lJUB{ZڡTz3+AMڌ+Uk  Rq<]%zPWHWs &uEp%\*tP<ބ\gDWOW{+X.tU`NWR3 Js NpgCW R}҈ΐ r"DB*CYnT}h!~JLp0R\ٱ,GAsMNOe1{NO3+.}<jQH-lx1z͌NeiURP˹ԂVd%f*,LJ(0TwJjfhY;Z뤻ېՃŋuVmwsw>P8HY*凔/xC9ƴW~*?Bm~Ný-V_a#un>|TnufkO$4O/}|su)=i]OnZF_Rx\aؽn<-Sddf?XN,6Md|ݹ/ĺn?^mQ^j:]ŵJ_E(2e62pP2o6hA/}:XU M~j0?ygs#X\.kL0awt e}yz7l_cCd$EY Oq:Zs)D R)=~ 9zow)5/Vn~~>M՟nVΚZU୯-VJMe}K4?2cVyY]ؕkg?MI~rM"E..w >:suy7 w;.)bçɣRt*\|1ּ~um>EO hTԻ6oF)۫)G?Nn!'Fֻ$)O>jh'W rEe~hS*I\~"ry+k# O)dLc/͎gU|(gK\>RJ9sJ2Vy4czUT>`$v6t;Ttl~9]&c[nB煜Lub((@feQߥvPG(y Yhg{Ōx\ }eXd,1B*QƁ8R *JSk+ y("xYÍ6LT|91'jjxB[E|"(Wb=1F vcd*vfٌ e3:kj' *noL3(Z moscSv.}nVX=('c7|D!)Ը&$c'@U  Ƞ2nt&i՗}SW)Kl;BDddI\ŒOKCUʱJ$b\tZK^r#U%c%]%Br];"F:W1۱B v>Gqmbv>ur~g[ =JƇ-WϬۼznnt%уP!De\PA3g"zMެន !N=[PU:Zmt)-#zUt`H"(#b6-;~@B rXpDкRBci0XډP800x&~h~G77ԎVy_̈JdbYYֹvs̖U,:cN; )7^1RBA4*5F!-##Rɒ"N#@Lgp^(FF˸";`б/M)x0:ƫT4]<|'nt'n*/'MBJo/m%8ӕ(O7{I!hϴj|H9J#Ҫ|}zboh؈n;:C:8NлEjX\葢~X?QV|B"M0̍;8 DMI$΋u;w4%H*= *1W(%z)){J d6LI1&w$x.X9*n`Zcȟ0ڦe$]* H<4ٻF#Wd/\3aY]cyIؤ ,%!*)QRj5+UET:O"gX< т LQISbZ z `2']w&x3::H9;Dogd&kHiԔ^JMCY>Zst,6]-q,הHv,-~K|2zfˆ CIдz(}@M!I'_ZAU uX s s G̓8 fƬ#'vz;U08ߍIRD&! [/CCuL6` :c^iiCccƗi3еѷmgP|%w5Rar@%7 +qFf%4mi 39iۣAQ g%(";Qge݀Դ`t lBv9`XtIz&M]\7hzHȸʴP"S4A0 M@GrpR8W>&^UD{Yq" O1G ƌ9R#vN}SYAT9PD, Y"lvJUdGӮ 3#"D`]W}(a޹vhO3 %dRb@P+ L 'D'U_א ޴kǓ١+V[ GNt6C˭sWr9%[#;O?~sL7 4*l΄ߒӱ/B- zկ6,tP! 2e'p2Y;EZ, 萑)t⅜l]Ní҄ME+o.Z+\(-(03ϭZȨ8- &N:Ռ.%\ㄠv.Q7߀m ?O?! 
\KuV\(/ʁA Lrz孢{{']i C68z0xwX읖޾i |f9fS<-> 놨w8k3Ӭ5:4*5ʺ)oҨ4榀,R7ncܬ~VjFߒAb_r2Gͷ㴘wQz]:Afՠi{,H9L8rU~ "QF 3'I"Lw|\Ǽ7"mEuڔx%N4:Iƕ vZ&(H߽\FghGˬ]v[)eھ;UPˢ/n_[GZrKϟޡZ7۞icZ?%"i->jM^vq<35hMy2x]$=zWz%$ CDmfWP%.2=:N{S:T&ΖY&sl~1^tR4c o7tVt9ń7[&Ya7fs76an~se}\:(kEYjsos,vˬg0LbߐImRnht{et3~zIolnرnȆF6j~u̷7O+[̼4rzk[]ϹmCdJ|9ꚫ70v2||6c5"f!1;@RZL}Nq-d9sٺ#^n0hG(g4䤹s<+H! .8EbZ%uqvGz9%&Nt+ vЈwewI}A0 ;6WeVXx2@Fb`Qigd'L:3Nb=+G<& 1:rpkXhARQORj2 }S:S:aYz8zexZOuhcIhhy' RhgC*@+<P!*jA=o.XġMB=նNf!> Y/ݠSЁ.ø`E jtgL Hr9osVJ$L9+Q&a|՞- 6/VG鸤nIъx-MGy6e pUr Ϡ>\F[1_W{Ҏ]/?U_J;\"uթjG3n p#QҨͧ?%WvCІMҷ%WLFjcë&:q2v ?_aU(%s$]&RƚU o6Y3y tНEbP6aDϫMoW2MpzIv U _gu0Φ{\3%:wg:s7]9eWX {˲YA[lu3gt22 =-}S' >e#hZF9GÁ#W ҆BBr)Bí h&h~F i@v$RIk^~ E!)OJbe*+j5q6+<iCh&+Zmu+z!z^M0WE/*R9Ye'yfRHakҙgjޭKn&c޳N,9iJh""^ m\+R8WVƎX2V]s#{,|B$M 0n-KrNa6 t nB3L AG`#yL!?s4@4sZ P=IY@aTEɶc ꤜ!e,TFjlFl?%PPye=j vmutD1d8 R9 4ђ[O歋^FeMdԔ@,Q,be<&f4W㵉9[R$O8"ӭ(L|r,Ƚ 1)ُ%Pb#aƽ~]PgI),YdI-S n|sc}gKF?}O+}RV}Jvt1 7,:|s[~thjC2 _%Z_8:;h:/twa0/5s1rEL7T/<8_LnD%b{I<t;?kGJ\kubOJǓ4 6[gFZI8hb Fi:Y9/@;`9`vI.ɑ,ϾɕFLVX,σgZ{ʍ_oϢ|x& d/1 Hn3_zYW;4#eXs(aj=KtL&Pfd{`K4NL$np'oԜ[4\/cˡRjݖu3[ss~ HցϨqD<-qVKqKd!1doe7W"s!x;?:j ii/?wF }&9,,uXZx@p O2?:l^z,:1$n׶T[0J|r!T+:):\z3#Yi]cbP2l lɸ1wĹ"B/D,]\G[WPřcׁ^ yvo{גՍT6:y|A#j£Eꓩ3DYSq=WjuGGu}'S[ao}8{n$:9wM ˕'S$Gޫbi\/-ƍ D}5m-KE/hۻV%[ dZm9/ĦmJ mZza>H У ǻYSqmP0al߯ֆ%::ɍ&z'C"rLkJ ) g*p(R} #)B22~,ScI{N9B-@. ci^GzM B$GI1!ȵ1W{1ՍKXJ odN"oͫzK?Yg|eHʾΎ޻]=jwr*~a>a =bͼX?,ޔÆ|g8ËItvlvbtomf?`PvۺJ|Vzhn2vdVY <{ocԆJEt$$1͹'fV '*>Q|uۢ@C?gv"O۟.~b%l<;.]ڑ{} MnqҎ w]S7Z }.~ҧO/6ש5o|sz셗 „@\F._'/a~@zcK'[lΉecN:>AEG^]wwr[2Mjn;9T_K梯+[цGD}ꋣаK;8?4[ay9w/;o^7,~d?@']vDEg gQyOhjmFi6y^)z7q5ܓԷ6d^-q7۽*ۇmYʃ.x\d/0=>6ۢmb6ZV 9{Ĕ69v.O:} 82 / ?U;]Wŭ*$DOSHj)0 L{ 7%_7#39җl{~y8*כ&Mճz \Lfz|StfYYRox6x`sm;\nV=o*vcaODLɔRq*%^ RK%~%I6' T\gJRi}~Jbo8\}wmA4LfWZ~l/~fEy (S[/ï\?7aK<spߠsrurA.G-fE@mb"{ >3moa]kxj3rl/U.ʗrծI

a:A^ TWmde*1ЭU<dl:^MX4+@tsa3l~L6!l;de]յ™r^)kmn0FaQ^hdz4BFYaFFs8npJxKdp\9Ѝ -f$:YYNJpw'T@=փS!@o%Wd造gOR"l7)Da3? +E[s'w $ow>f+oV0 .̾T]Vicd|>G)Gp0RDTSXD ( 䱮)1Jl: RiuV㧑@pYPAdڨ@¤%뉊Y:3TEwqY+>`MUG@u3I:xjZr Ԣӥu38v~U|zY!5! um:`5;n0 Mov ƛGAXV}XPGڨ@UءXש`Qv8gbq R(nl4wd8u4kSQlMkU6-YIo7ሤA~u>v_\x p4ʯGwW,޹.Fh ]5xԋB{16$l:#–U ءBU{rgCֿ?N~_9MW8~וRLE68ͦcO[c~EU9NI j7$JRk6I $RNJZPȫ$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@MѮSc%|$|QaI PےZb(b{% $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@Zn(aUqJ%=\>I Zd$A,> % $$-I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$b@8 O3J!0 ZXI 7:KhAr/ZbȺA@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I$>YZ-v|4?xAKMtq͂O}ެ?jWW{@,Rs .!pL|KwlKz2: .-!Q{HWb+}j<6uEItD]Q,#]ѺQpX8]|(j F%S,]mt(']-HWhIK5fQUOGwx4|S+ZqudjqVܜfT)GG?js,B>RTSN{o?Z]_'~/fx[<^+fuj,_Vx[:jWY~~ZoTk,׶<-Hɮ|w><|ڻLws9[q$.Z ٻw)۪v3)|sa|Wou4`zS9 wU}|78=)+yk#>16Xz#y U4t`+EWDk/FWcÉR]WYNpacYIytGicY ]EI2ltEs>" ZLp`#]6B\hm+,m0(:lHx3ӻEWHkzbp]!At@]h׌tEXR|zW.BZ7b,Jj򑞥HWf+ i?DJHUt]Mv+çwE6Ƶ*&kltE](֫uEƈ+P!r(F+'HjD~0n( ~v`N=ɼ>x۟5ou/꼿=/UKJqnhU&8o>9`$Ѡ.maP\ *i8)YFJ1JW䶔;ɧ**@5z3M^l4H)SʻL5h VSrSU;t.:{3)V>IܪK7K+ 2ʺ%΀[u'M"J#6]_v*zNgDBp3?2w[uaLTv潢=3c2t?77(."ڧ+ž": @pjeӻBhB"JDW ԕSV9NA֚D."ZJQF#Z 8 8xFW]-u*h0GFW vBuE֋c+F"J:EWDt]%ȝE wbTYթ'!ɇ{7h8Cp *7ZEDe^5tTKe'2VB' ʨ!Nm-ϽA&.!-W!ї^(c}<`7>̫<4,J_bI+-zhk!)F&p+s(K[,):xGy!r&(]WH {ȥ0 +lzWDk]eiDWѕNsW++-_WDiej69銀c`+Mv Z(dCt]bɹ,Cx,]+2wD]Egg0Hf!KQ:]-RWS#2nВq޴qawbqR6k:&qwNq2m<>5:h}ar2 $?U auWjkP> W^kZ1<~x~]#z.8ibp[&7@Z J#Z81+>KW5]-.1QzYXDyW<8]<4]Zf,vdʈZLtE^EWDS"DWѕ!HW ɰB+㣚+GijL+zltENqц{WD9ibEWѕ ENAʳjEWDkt]%A3'`+6S>xu5R "1|tE.pц{WDdjstQY@Mh*]WD 2wH]Q̏2V:suiuVfw6{&yE%poʋ}6ǦѰY}D>QzY}>D3*gHD#(S]}1;=Q1i5<`7<\?<8S%2wt]uEN+]!=hu/]S?Ea]uewy_]prltE` ir('trtŐ銀# &EWD \ JEuӁWWMn\tEx]!(Zh9銀F6sWD t]!QZt@]E0v(lo SiMcͩ 8p1sp8 8FӈA0ε[M/QӴwoϾlv`xbn D5)Uedbv˪&u,7*8H)I]v N2+L"\F)D[:t:(ޝ(|ajᚙoѺÔG Tqry#]p6"ܹʢO]"J+Z_wyѳ+ &+% S#]!WwP:S{ut 銀cp+ūt]]-QWި8 3wE}D^"KQF]-PWX殂qJ;6"\chzqW*g+FW5]msWDj;}BZa!>])8R!?I@7h Z܎鼒,lВQy Dt 9eh}񫏈2Ȅ` 
tNOx]/FW~Ǣ'ʺTwY^,;hLQ<K ]yC^X銀].X.Bhl"JkEW ԕ O* 8FW뀋vGyы+댉*FW5]\ɹ]QzW!CWAtТ!D,]n\tEJRd rte>3v_]!QWLTOuE-3]FWɨx^Q_gg}w\S믎/߭O?Z8f*W']A2C՘uaR'x}KO^Xgx G=i7;^.w_`ZXVFMm@ARLڻvlgW/{}.yjn%lss[b(ݾg?l g_W5wdɏuvR?v?L< JqktU:Zy=^3Z:_>զha_KTUVOWϺGjn´m+P1vU?Uݪ}/o+V`u?mw/\_G?賧-J$gmM ~GBg7q;\rYAa9<}ۑX>8b}?s\k\:|?`K]l%Y mbM6M\o?)u|jϯ8M8X]VylZ&FoE61iW!-] "/>XOW@tX\8V(aƫ\\T3֊^mʷƺе!67AlcNag|*uRgfٻ6$W܈jd76E`F?-)R!)2ç$RDcNJ83ꪎڟK5H>&CZkȆ(LSh+E&k5 )^u.#5Xǧ_H)YMOzjq'AIz(uhT\3?'=qBW0j懴ks?].>6._fw~ln:/^f,̩֩Alm[s?!>o~Aٮƞnn@nfX&D+XVhppXɪizګ`w^HVs`<GJǚ/qs|(WEl,FyQmhM:k?ݮb$(?߼zϟos_o~|EϴS GG G V}]k.ik2v=ߊZ|W|0}ߟ]z=:=P& "˴_OTWhn#qĶ՘L"_nKo45"Ļ1Yʑ٪m6[(pe_ևwGăl$6oFRz2cgvs0@Ed @ףbYrg#=a!\{>aR|;q qa8_{:μm2V1JbbAIdh, nYb)D"{3 ݞN+{:=|hMP?Ck4qv2OP `Ş1;;3<#a#tHhzDhe ? PHZOOCbժQ$LwDŸȺp2})-8OFߴ?:ZvOZ^ŇVMW 9YI\#FJ=q6) Mh| ;^Ԁ& }f6?EFY4fi[6L3và-?lGr" yDGRw{8a6>N8myp0w#c急]#H&vnTIu>lڜ߰mG,/1_&? \s^CФ' ev}R.- j ^A6i'UATR)tj3Ϧ&кV>i@~*:S Y+B|53Ȃrz_ ާGEC?zfA;}ƻQiLs`,Y<FpP+A`ЪIBRjX]t:Kx$u,,BOXc̢eSkùtS TuJwNeM~(n͕;?χsa{B6Hh A-yǶϩ'٩q}TL?$eKmSAKvIеFB zo=a|sŞ^ci!)%K\b匁OrQ(\)D]QB{-VX dQ(d2FIJX,9>dTd.d#k ̵[Ύ;c}6}: 8ܽqb>n&[clZkT왹{nwyz:Ի?FIdw1e"X]4&YQq^耣=Q [lƨî,,c"(g^n0-::1YiuA$;iLH-ۄ{'ȄkŞȄk,훐 A4^dt4<2-xec.ߚӅTbo A'9A>2ւ(pمd20^ a27<6\.r.5Z1wi?P. cƶc{X[`u3 Y1^}!j` J[ W/%o5.7W:Rn^tRSj)S'** dT{jo!Vg!4exwϰaeDtĔdVf %!;.)y' eĘ7E:mєLcLƗ Bp`@cYVrEb;0 ~C:,Z9OX6«8dFU67Tr,mt6XJL\axoDB%#WN挴w. T}Ya0^a3Ib}lJFn+Y:cp-M^^:VHaDȡIgȊZǤQ42 Z0&dɖgC<.aJ%MܣI6E'5SI BlE0% ˙D D?;$I{nHP).F3U. 
^XXr`2lAWh69$NK$cY %d 0OPI8)EI=D%R%[6 ԛQ@3ym!DR%:2]bl>;}։s;B|ucym$XHJ^FO.*n -YFct JK}xxlro{^TpfZV ܁/B 40mђ%E9ZI5kiLɴn@},&ʞc&܀Ă Y 0NYp7 5iC1M.g Ke``hϵX7qIPc9i`,M@wrpBURkFW 8@hAFǔƌ 9R{H iNB262ȁ&:f!f ea1$l4JD*[I 49yl`]1S0ar9yN3 $dRޥZI 2B'>!jkVJL_s6O%X598A_.L{!V?Y!k/j}1{h=$P3P+Rp&,N.cjB34l-mԣg~aXIw0YC =SiGl[))%+;qlɾhh/|f^q.ip4$* /6crqLv*ayUv>}$~Z5İ>Ί똄/sʁ%NtnJ<p¼sTϦڷx_@' F5{d|7/d<}Ð'K놠wXSr"R(2VޕB)E C3nclnYosF?=·qֻEtO`(k!LC|@yd*$p!㞓V -& $6` 6V 0ms^qri7y9"7_VI)R#V1uT]చsEa^FH8@i)H߽grţZǦ2wrOBrw_v_n,+J(*>ۭrT/k-={o5E{ jVz!e5m"ͨ@{^#W%,:,EįHMX_}ΔУ)0J`ޕ6$׿BӌwwF h3 u{{ǃF!䒔5 wGV% EQ)U}Ȫ/ܙ BDNJxە)taƽp n}Yڨ~w3_n]om˞;|百gjYJQ!^7U_.]8Dee\BgVXR/8 pI*Q oih12mˇEX.Hzk#+(EX D@V`OQ0:FoCDB0:Q&g F:{)&Άy4=v)ոl&)W}85 䘐ޯ~4pQ.[E$!" OӪY}\ Φ~i\gw~N@GY 3yz+ ^WIV"DA9Em=_ޒYR'7K.Jqo*Zfc:ۯz:z){N_ GϣfvfLz=Akf=!pdjӮf훰 f^^WܬyG|,?%>2qo܍3AҪ^b殈ȝo,+擳˲koTx$E|V|Vԝidx)u|N?J+%hrJDE*$2>ӁU_0ez:JKYH1B0[D%62br>A%#DM:(".jG I o7(P  EII8@`FHSĔ tERG.FLQ^~3dtInE[yjqeߦX#0!檥D #5DHV;/h+Q<8܊23OC=+C׆ Y/lS6ø4!@Gr`K6F@ɅP*cNILrX޿\w~; p~_̾w2^w~|~;/'͵?9;i M/yag.h\]WY[[Ju2^jYJE2 &nq3>"O΋=iCٶ xVhQpŢuYj*>hq>vMh5EXMc[ڃ6||i+LJ~c˫&76r<4( 6?_PJ {&S=YVS1`I6t/tx?6 T~e,N'BmG &%ˀ h0X57ftt".2?{ttEvEjl!+nF8v]Oc=mQ6)C{ab'N%> Q;f`~_$9uz$hIH aD D,`D!R*Ӝ)aq8eȁl" *z+77 EJ[8BaN9K"5Чj}zZL7zjVs'{U}Q;O2_(#ZhrmhRsv\d9{Uy%rIWT Jwo.5Z}KPۗ|DrądBÙ68åid& Bp4c[C, .* @mG˳=v*mxR(PpIEs-{A 4_f{$FAp~AZ@2(R4+D$A0E_TRB D8k5 A^L?IDVc,RHf*)@H%!ǂEKIXȤ(&XLǑW)Wflm.c1,67'D\@/23^q\ܰ$j p82  &+60kAYG  N ȧUj6 P`Bbȅ"JB?(&)MyЂ3.8"O"x4[M #b1qG/C\-鬳l0.;\\qPBqCm0!!9`p3f:d,U#)Jpq'7s)Ua< `ëx  _(!G]E?.͢;{k(5W4U HI* ҡ-w:pnM7.P6ޝRoM>ܗAv֍^UGUZ^5.5j?Elg.ފ\Q?O#0Nw._hlgs4>O\~In]PIinT=QZ2E8JVft#vVo>Ĺ&G =?%8p'Xvm޾zlahnmaA56TcCv7:Ow^3_}VŃZ\5GIt4k_~5f.6z8N^zZT$^?qY!\xڿJ^'m|%<8(D rl唐dԼ)@9&tf>t압OL:xL;KrKqNEJ$`g+ e4Hɕ:k9 4q`/ @ @#!2*%3jpܮ~^.і`Wvk\82m\7bJ-q+7Ι{!g(uqvF~@Xƹ4uYP B U[[W;'SGf=|Szc+Yʳ | sQP O>:2SQ%)N1ݿG (V9T+M_eo 4NSqeE%MAH P Bw* g"[ޱ-e)O@C~t r+JJ2b ;K<|h PcιFO9ŎwPczym[٠w9jie;Ճ9@4<2Do,$+\h*4q'Gd.':+N]*kܰ,WkY+07aatIٞ^|/a`( Ks= mB[ǺZ. 
ƺiZ+ |/WY\CPZ r dtp T׏~.Gm/U媍:&k&٬Yaǒ#x x7>3b ;fDPnjZ̻2I콙7z?x bޡz4ĺ=Ko<+ÓMv?SM,^Td`KCF>kR\`758lP_o,ZxSOLn`ꝫʠUY Vn? k8DY\nadiw\JQamP`UC,dWYJ%:z5p7|1S ĕ/ WOՓdRJwpF1PWxUWY\v0pݵ|R +TOcguy[7YVKX |#O7a70߭USt2.*w T9潥^pHmÙ.qϭ&dOW#x<}qktR״0!BjmBſ5ֱ5?C["Z+1$NF&Bi 56D c -Ftx2ޕ`,"wei5+Kt{ٻ81OZqڿߪggd[+Tʿowbdl$q`P+;LRn^#Abիw;t, @\ 6'L=g>;~|ϛN(HN;ӧTC*INJ3r8h-<#$*.*G89ʲD pM . ZS(c$rqs.AE`jhIl]Jkg_IA(P9.!~p%?o4i@-ײ5AmL%(2.jM4)c$ב2!(Q. `Z(7p?Xe"KAV< e%4H єjpH:t=KBĽ1TZ2c<7)e4G΁$hds yAiϤWm#,D-&&vE?uRBBCਔΆdObyB)5,J$I*)ս\.%SQZ$$6pGA2|(5*iCjHǤ<%ݾ/s] ͛_awo77io߿շ6=\ ?a%e~}ڍ#ɏЫ [rG{f0u8zU,cty#.AC &h]LŕV n>]G0El(N[Kz!ԁh 7 e~1ƥRau?L\Xvgw+H9|m.,:M1n!--7++\FnM_pAeu_nmޠΝgk5z[{G9,]T?UO/Y\Tv#%+`F`l]s7Y>!woEHwT߹W5-Úw.,h#'%̧d<[|`4Xۜ2&Vmګ⚬:y氤DdV ̡7I6&U"z8{&U&ǽ0^#:ww_%eݟ]-J? , PhA'ӓ gޯZd9 6Z%8zӠp.5Q,Ya|Gy*oj'W=}>&e#WP=5HuiM$e3vS!"6MMrV o H*AΆ%s)J/P ŻS%<*x(M0F JG"JqF;(DÙ0^R;Vt永Kpm4g' }cvMR;Ԫ~ԞQ%ٚ&pc˻OAl tJW=ky y ɟcoG .b^um| >fK#pާk8.KO#Em&P&>bґ̒V3jh}}{hMg.7[Rv| wͧ}Uh4oO8etgDFIJ8 bBeԬf!{JD,KQ\8Vg@}2jTdPܙܴ)5pSN=+dHM!eg>{lҨcWt,`FnY"gwWcZ0!e6@P 2_ d^wȉ;yLHۄgdBJv>&2L>*^Uڸ zZKo$A0[N}zCl[ku@2D'#B8K{E锨G#,Jrk˂H]m1+bx2,<-Y+Zمf~^l=?p6C}eT=AjG+*ԞկP{֊s/.%Tt'SMcxt&gw>'Sz %3xC->=rC%ՉL]rgϵU(BEPpjvhb9nkѨ $kD,ѺoAԁ*=;)'!oRa2{S^>$&& ,m祸YX ,ng0}~? 
27ӛ 1@40^8t!d%B(Fs<)&[p> $m/?Ç^},UKok8(s=c6rŬ㔹f(_vRIv-dek!X+-h0B%%%]Gު5EI1]YU@ =2UG9 T5uJB0"pp?q <?%tb=57W93 f [UnB~ra1ݕSߜB7Jon9wD-2@gβY4Q8Gv&1̞:z~^IM衻;2g9ţq]wʆAhÃyhN'ip/xi]jϹs|ceBFGzghV!.^s :OK+E/2sݩF 4P&6p.BQ+RHzWQ1V^9,nO~c]ڭƖKN5{4E^Oo-gϫ^q=C)B?'u_(#Z+ۯ=3ƢgdAׅ x,621҄i̞wn0]_*TLS y')@ kD$@} | S2\T؜Ki Aڵ:(BTm}29X[Bj}RxTv9fzF7gkny/73PL9_ţBS ,pP~txO'Bq5QI-`u %* nx  I 8Z @EC +d}E-J4G,υ* QB!~z>[%0ldXPj́+>Fj5Zb:I1ށ/iSE]7H|^E/͒5%cI Ὕ q6߷eNWW5>7juKa\%nե7yۯ]._ky6|np +zɪa~quu >;П{..|O &?ɲJ%~ 1Mگѫ6?LX'y6ވa~qy>Y1{s͜^,Vam'Zʩ43)@ {Fpž(5U=eyİ#`7˖냀2f;T lJd#H\}Qh(c$]$vCJ/}"@VERCRB$ Dm)k\BP&l6"]Xa NAY2e>xBD$)kk3"Ϝ.GBhl ^h$u46,\I-ݨTȐ  cf6*eye4C˰ChG%$>z?moa!-eIB^'uqxC(-ƴQbyu;tl†5e7|ma{?v:bl ELH H@#Z*oxdYF~4&ƻbovه_=Moqk@෭ͭ,wp෣s$Ƣq0&w3~ŧinz1]^=`Zh'icg·L]@9)c<Ĺپg|PrC0OL.gRުH~qZ:eޕϵ@ȵiE4I _ =R싞b֙}CB|R\TҴ}vE97"_(WNi1>Iy_4GM.:a~!>drn~= _4 rl~?ͶDT5[~@cm#X_W,*c:7n ToZXz9nk d;2#1-lsrʧ,B^a7].Onf,P.K;Chu}zLW$#júWsX- P (augdeB6VV'QC6$< 9GN)СnTnsRB/%G#u) iT$rfl gAlZq8; C 30g2tZ-T$ʐ!`AYh|X2 Iաmb`` C2DRn\SD(#%kN쌢31DŽ@Q)'ITL726g? 
ud\\t+TVZ/.ƸG\qNHKv^D(/4: +"Rq 8Ϲv싇~xx⢼ȝ~|=i}Ipc㑲ٷ's]0 {dZFȧRJ{(o)RTcKRtVc<:$GΧwԢ C1v`"k0c.$oP8b5eH'(}I4*uBrm7WƑݿZ3)v5?ŧ¾x# zǗьU^c"{zgmGgHr}ݳ<[)Kyq֢d ެ1v>Pb4RYzcKP8{ҞfJL*jYar]p ǎLl9XrY`v1JL&eTZې% 6YAH2JTNվXֵqs?;dn 3 ] H<6YPK-)(?κku]Uw4JWA0l?1`:%u1wjO2e/jKDcTR#*K̔b1j]\ Zb/N]{B6%K(:8-ҘC?7Vpv7׳6BjDwH#$ e5Qdm5NDY )} 3X|49:vAHc&V6kzV&7w74jz8]Wj/;l[ӛ>˲N~wuyJnU`L[SuzzK)s^`[v-u^i Mn҆*wɏ็2.L1N=Dɿ 3?/ǘy>*WNډ'1,_ݽsh*0>',~0]Tb(pW9E?ExƃH5,/jI{9&49 iY'ʷCh %c}.響y0y-ך%~JROs,>R5;dqvo<~+G`!T[_+K]뤋6%_Tt\N Hc=?.btڹZݮ}Z׆KG!#SepRPF.cx1<{ =efvhq0k`ֵO?gźU'OZ%HΣ.6Z-nܼ~C g^;d-@d"SyKBbtŒ"萋MknrO d׶2 XR(0,R4RǜBHE"gH2b:xE4gG^{g *I$@'$)b9?a7Axgq`7Cɼe2Xqd#grקؒlۺX>v窦_U^H .+J eX6hMJ Ȗ vÿ'!~^͊~9馏^g=EIvOY&A79k?# 2)!$eU1&%,X:Eٗo|Y?g?3r~,wV`Z,};}p}sO~aj| \Yf*bmmȟYJVCOI1.6RBe8juPGw?0ayyIO\mVTf.Wj ov:?ͷ5F}ErpH!ҕ^NQ]K;!.t5VfuvuStY_dx܌ڟV_޿Gs+xB\RQR22$$ r0*QJQ@#""T]߾> 0dX>al ᠐!bbHB`BYR06T ZJhW1GP^}_٧:ݎ$ !RQdM҄E}@Mb))Ӊ)v }EMŜ~tENX2_|1[e/jbä`Z_YF .~Ϲ?JWgs*S?׃򗳟ݥ"EW~WfIg?&fwj1iN?y`byMm?wV_YӴVo>Ͷږ } kkLM'<{miMJpMN~-ݔڞpSuM5S+SsUj EC Nw|l-\}{9r˻&w6u"z|{FGBX H W8>wRΔ'vI3Tsf4cWIT\9.s6+KR9*K__nr2`L_X 5i{t/k4xܯ^|a֧l6GCv<qv2?ٳΞ_d}SzyO֝- "1FhN|Fki[Yf*d46d2߈Nk1zA.j4 /_fP[x^O=9HJC8tHZd"Eq2<ʨ GQzM@j`*-o~;]7!mqdHۣf!]^UOF y;>c]s :ݻZx}(,*ExM90[?ʏWʹbsbOzԛ-~||(Nt} >@Eaj'\1tHUP6NLXEpo>rXv6 *YG#fsS./B2:IB-/- Bd'vQjU_!YIPŰyv)~%DHHZ`Ѝ8GYoW3l9Qz*l:J"pc15M R*ū{o ` v4&eHk>XmЌ1;09.% (S?RB82#s/OY#HQ~'3 )[R z%% ƖtE`Myg[QY;GV&B+81a;"A)a|2M",R垘 %0%J`c,$)`1ND5l7sjPh|Z5>)ty"9$^Rw\;km*U"C d|!rHEy4vqKn :iȩ!6hC~ _NȗD͒$$qoSyɽzɎ ; 5% !2VMJ6bJCKIb8))'!RRFLTIY[*'lMR9&kI8Iń)OMFu`g ᭖/̊g`UH e%fizൡ[՞f71;&rM1 }p0DJ a)1z% ]t>+mbҵQo!<3 5dAdڧk%Α!R!Y |pKDzuFȊ hAfKIY^IXQ,*L6˷4?6ymX%hkR |=? ^7%w hXRlWbtWi\dcJO4N.风ۖ#lt9#dcsJ>tsC]lp1B:0( ("Ŝ%D_: an]to[*^ۑ5D~.xtkɊdhy$@w3g<0K]ٻ6$W8 #ukuu ЯO#)UDڴ ~jiEx$CcُxGBkݝ+:ƾʃ|urz-z/v`rrq-Zl*gVyäp٧He0vǛ-ɞk'vh!m! 
>]\0+49-y:%1\R{ Q,3>agpWׯtׯ~O޺LCѣ~9]?F`dB?iVFq~qjΡ4fv F2IVZ| du85qm諓)<  d{z$ҵM+tv:nowߧ0-pEn 3kIۯYM~2[W4o`>Qt+3 S{B To~Iң*٫ޛh㣊^&ˣj=|P+ 5μFj^ R& aJR:ZJj^K%Q%#6`:/KMd0WBPNgN6h(),4LBL+x"@`G&HN ڨKMdȨ y>Օ8w+!XSCH76vϼ_w%?>&fs+b?]#Xj sB 87Rs4Hϓ.70!Bf}msu}e* зIpw7$K0ʐ c h dd$J =`SPJB +%+5Yf- `1dZ6L:g;:$ ]Yw 2eC⹡G_gH1_b*{98Wle R]A&/<~GCB;Ydk0`<= U'2%`תNP _r@8t֊3|;^dʦ%z tsH̵RQ^X.Ҙ^~.VyjR+= g;0?߉lg)XIEVi 'B>Y/7: |>~w@]JW鶧oo^Rq?.o8:]Qt=f`Sr]_(v5 y*jp׎jώZ\`jGmtw(Ӈ1iXP3Q*JI1s0`]ԎJiyfVbo6š*>O[jeZk]ӹWQJ0:;#]4TBfȜF Zz DΤ+:L|Dr`۞,(1J rd H'3ZEC+h!$-TYFQNw8->J MSe7/ֻ(p M` ۲1 kcbdP+cS2 A_D׃p+}!0*8Pِ-HC%Ǽ+X`+u*/[3ޅ]_[TY؉Rŷǭ[ . #6ǥ LjS,7AKpĭќ?P^?Nw?+yb=q9#)(q(Gтx.|hMVخlTIs bpiZ6tcKsKmfP6,h46FbAz>eYhm[wնjaز㲽k4H}~By8U\q GP U文j҉NL/WwyF2G?OoO)p<}7~O/4)գEQryd7vـlwt7Ae/,\_B~ zN׼y(/<ދ"^>lb_!i]LU=m4@v9-v-_ A/H;>h^dy~Ϗ }( [u4q >HҕjG!arEb %wQ(JzσdW31so6̍Kݬ/'u)2IKdM .c A!I>K?8M.wo3x .9>CAɍNT ꐶ2i;hi5>K^XRNq )RYOF._l4 uRX[dSrc_`7bv:O1s߷#k)U7gT-SUGPCէ4&t_ԠZ6 rc7zO柃Ӡ7!l4CT|c?;g ZbӉFHbnuH2 `Z?=^rRm\> h''5Gty^L?K{PPd;O3sé.q- c)yJZ%ISiUEVRpˎr Ωz`Vknx=&ͼl/u ߘl—_.o<@>`t_Ʌ3Kt|B'ȹ&ē.dª٘ ށ?oW"! =-З}T uq&ğtv;{]\;۞G#! 
0r`qQE͜M*L ˃ Vtjt *{d\a*8p'ǜ*hp%1 L+x# h8䶔ŵ]_Jgm)UCdB%Y%Vޔ"q})U/UPJ,Bh=onDp<຃/ )pIr^{T!s8vfKL١/wgKa/23 *x6 BZca 4!s* XVD4^qz&b@EA'P0LLPuKto}NCVqIx3qPOljwU=-!0PNZug{ʽ%^0ss_&QxM+&.j/uͮ$f^V 쾹*K7檈"ns +}2W[ޘ"؛EZ n(4WJplUao+sUH\}J >]3ý0WE\7护Ew\)a4Wǘ"-R R")W^_Wl~1F Vț4dTYwէ^_zL.j$Mܠztv oJb U-; cK3d8ZM1ZX+Rk˓OJZ-t7niߡeQV Fčے֖t{Vwi pWT#|61/ Vܡ1"Si]!YJy{ K:M!0mv_꼍煇[&{7=D j#5mr\bD95hѩ‡[+U@pB%DSZe{^Of"q&釚i i)]G\I:oUy.tg1 YG+foH8SGBKT)b9EB1zcw^&h!|W\}!rS څilP\P'̓r7J+-(uҚ\JzRY> 2nDk2epVROJKCZ!axqq5 HզK9+B32\ PRHHҚ$.;6"jh3H cf>'D'AXBwi.=B- J~QOeZX+݆Kl)TLH.Hx$`q>Ta-"CS8D3l6NM}*0~0 y[FQY@}Firw!wA%L9ܓ)Kto)@&i\neU0zh]p)KJ iȲd8F=BmNk_qإ=&i-zD~1JC4tbY.PCvYh0Z xIvUXFЅӑ iK@!x61w2mK&I򗞮1m¢J] +'149rBQ6D؂1֢u9Q&lEjGLN J'}Pw.uH+ mG⑌bDnƫPVXket79IN3TfUnv(m 6[3t0t"dB[ܝ6ق"&_pU?76(A `FBZ0r6|yMt{9Lܟ1i^޾>jnd y(gnaΏBZyLۈ ɘ +ܼ]n i\EQSp YJ`[P\( zzb#dsDa>PƱBC V -Ɏm:yrHx#Wͩ74A2Z8Xd9)*j 2Q@%WS^:kB-r MRS\mvd3e_Pc )LO[ ]d2Z5!+ɚ6d5[9)ّI*oBAW\bDeL"ί;6NOe- xwx::҆ _5H(:Z3eÐe EO9<ЃVNօi p#(XJFajQ B!g6UrقLfD 0\KEt,&Af=J* eB xq-@B֤r>LR躑duţc"9?3!qV/oO{qf Tu@i6yӖ_QgoDJӀF;Ѹ"1K>`l987o3g~nErON ǎ89tKhϴJV!@#H@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': t\'ˤ[ON s э6 J t@'j'T@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N ,2ԑHqYۋHh $d tD': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@u!& qyFq׍˂@N#:  N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@:[>_l5>m7׿Psijwϯ=D'K5c\@KĿ_}“}Wd}۳hr$Op`ʄoorG5+㑂ܘt G(7w?e 4iw/_Ԝ_Ncͭϯ5:}h= ?W1@o^|ʮ>Dr5z_ڵNC5HY7GKx?0ѳ^OB nH[j)&77mh9;d,[n9"/)ptCpkKB18gב\q Q7r%6˕PF+~`LvsZBX&VўT*W(+^ WrئJ=w#WL/r%[(SR:\ɕSF_Ӌ\ m{+ճ݄ԑ\ 0n ^Jhmڻ\ brE)s,%W++LYr8n ^JhJ(I#UFHlF^Jhٻ\ NRR\Gr`Y\M1( BXrŲ1hV0XF&Ѝ9E{ъ6YvdEAݾ>>|i+Chxx;>̃+wo_ֻE~|Ӟ ]B?GVk֠G ޠ0O$Gyjx(0;1+iOj)m/b#c1Kg3x@#ǁ6r8Z6j>e->6gZ~āR3CZ-F8>!d9宣L\:ܷ~ۧWgYQ7Z:cF:+fFMEKtJU"'sm%yc2lu,,XHo2Ϫ\=-z+{+u܋\ -^2dʕ㘝H8#Wh˕Pru@)$WΦ\GȕJ(cT:\3w%)"WBqr\:\LU49\Kɮ6J(]Qa=UlmFw&w3Ձ䊓;;iwWmL&$/b<52>jZ{\"+̇lS2seuGuŷΛ2GQcz?ɲذ\-[,协ZDkFG/]ȕWzl[)z+{뻑+:ZFUv2[ʕcϮ'd+ԋ\9&cJ(mR:\y)RGrR7r%bPh 
lXrEy%sdz+p`Arv$WvpqZr%z+J:\%{*|?ٕR7sWBJ(\Qݙsg+coo.=?w6쩧"X7]u.E˴Pe:g;~K?M#OoA4Hf̮o;[e=;lQsrڜ;֓wU)L{R@ չﯧJ6=].hYv,F; QڝmrE*Wmzk3ƃ%l\?r%[rK<]2ʕsrQGr-pc7r%.WL*W+9FzrL7r%m6yr%{NY\ '`p5ȕnu2ʠruD ޚܕfE@AIr`jRO+.w#WKBJ(sV:\%le"W]ǩruƐݱ=}5S1h[6};\|֞ Zt^>7aUצSo?ՋOv䚦436anaF1\cOM~6fz|?a<S[ö7ڌz .|hFs@Eu-WFs}r1iuVY3ݢW̖?.p}^(qx~tp1iTR?:xϬqD4^`5u*lTMC ROe,$RHФhsLP,KVst=s?_UI(ڠX%NyuXEW.g} BR"d"T*f]HY&&s%DNZD2׉3Sb~~[زYX QJۜX9R+2;lI޶޼yb2!qsa_o{ʅdnY(Gsmܫ5|n);:,p>%zٴL񍙍dC_?oC_CxXc$^Ok"Bk8UԊ ;wC%ݷ!7V3)R&;4|Li2VP9ׂGYdD{DxD(dz`x-'⌓85g<~2$ s'O.'_7M9)0p41%ES)'k%yJ/ X3!3rIHG@ wFw5YnQ-2nwo vPۻ)Nݭ/eP˖L.iF xB$]RXfHƉ*;72RhzxiꡧzzڐGXt2"E2xNphQsPFF2g@h7yAbC?YDi3#W ^6E'i{tR`φDyr{XX.ԣP1ZK}gR B"MхPzQԵg7 wH ?Ȏ.@.FUOG0ԋ͎%FX!kZ1o.ebT;cwOJdK>$IpV$uIlf9 u' .9Edrb 3M? ͧͭ\$ط(^pnL/8#lp2^8=۰vǓa!77o sgVQ46i X!vJc2k|;Y/ἳ\;p`?O{ jTg죙U9I/RgD Nͤ~vl:pMܬTq+"ndJݘh?59rmɍ-mґ~RiJ'3D6wܷ:MbnOC>ѤKPxeNOlx&i٦Td*>1 Kmohmwj6zg˥Y 8Ua4hv-[swtFmwcQGM\tQ1\;|aŃyPŖ\":G?~,gnJdxY4균Y):4F;yP:C)aT.!nPs.}`t}`Y7˺h멃WhN g ,H3W #@=洱_dcSKRЫ*u_UVl8 '1j-@VjTlPJLX*Axl/e?2\} 3rc %$2k=Đ=''R^$ `!1FcBfYǘ *`J> 1" $#'UXc3r PP-Fe򖭿B(VSV_G zzGKV⒴t_nj ї;*Jqѽηei1xp 4,f^$TRXlOS9p$p!'<2*h)AK[wbUI}E:c=OnOCzT6#kԋGw6ƪ8=:Cޱtb0.O 숚TM/'ȗk %JHs)7 Ug'neÖJG+))FJ+T{ϜϓmKbJ^ OᤤE˫HI%rIOR$d;͹d#k)DJ5 rA}vjd'P+&ydZ/o}&9y )9IȹN㋚tCr5Մ <FR(xr2[FAG) !g,2yNn?LU0 \eiՔ G RjrPho ,xN%յO9M$ŽrII%d 0 HPIО$IJQh$QZbbwi26[JYzlűN3.A9Dp–z.%'ٓ Y'NtXT-W cT$dLHF"ȈȄ CuBCe4^9cNAiib/c$c3zo== Y[%wK0>O8Pq4ӄea6lQ%E6c%sU.\I39': ÀXC I˫f(2vʖqRv)U1frܳlaɣ LC݅6hzHȹZ(j,6MEC]tA*R& թ8JAF`c:^:ƌ .s^Rr9Ҝ6L9 MB@bt.7dbʖh*H?MN@1)pg<v+N;iFZtY,wA))i%:d V[ER_''+r^?cbg rhy_In34n g )<0sQ;V_l9@Cgӊm[޿y&F{k M-ktr)5 zn3~5a' ǤLH#bS+"/ɋ.RȋUgVI)CCU !(},1F|!փOb?dS~WqgjS^% <2# hbh$ST1;S_n .zm]۾ϵ-.Mub)6ʅ|qjTJAy@/|:`N}@=a_s4\&AAz_?''O BJvt[tcrO:}xjkE[ᖢTcLcYZ$t++VouZأAدNGsҁե3*WUIKg#鿢oám`q79`f L %%;Wlrܒ,St("U :& -B9hzM:GbIFc*+epI%93*1nt̩Zu ^Yͳƨ18`:,7k [ }h Cׁz)mC}}:r"g2a8q rDĽ0(;zc1Ji^ڠFS mv:3F0Z#}B1HQ8dص$YܑHAqR!5TEf)99!JsNz)L*LjTgG\1 7հ#ς,EWJ` pk2k9x81vUwSgO165a8.?>\>L5mJU\.Pe G=-i'8kʧחH3?#q< xQB1zL\r4$o}ܬ,)e߃_ ̄?Bb2܁?,tj:%0 
!|nνZh}$teq~} g~6RL=Z$@"C Ib RBb8uPv/9{PQ'lu^\`ܳ!-8lٜfڏ~X;'E}F}S5Flήn/'.7=ٷ.o7iV5.mpG;4қ4^ZJ>ڱ{sHiwrh M$]s tih7m=bEW7tu{f~i8-^~gh ҕ ],uf7 W-]E<-߻r>\M&[~qwϻ<~ޙO4<_ps7G]z#uk^2U.o:4`j)Åu(~~J6{|O?j̎u %3MR6*mɆX)pߧN+g`q8l |Ț'gB6KIa]tILˊ+dpOwm&oi#ggv|Nz &0"ܧ(kKT&DWhQ-=:QQ1jf@v$RI׼/GT&RKA`-XyVgbt)`)^=;CjzgFzvs=y@Ñ[f*!3~``tٕB5!р)X-5)KN3QaڦH^[L\+Hն١Q5Rxm-k5mAmn $/wt@Ooq6vƏF/t[l&T&䵲0=OdI d\w+Ӳnԇ C!{> &C~lJc#.L܎˂r&G,Zlb8ݣbiǾVrQe{#؍0Kˑq/i`%UKn[:kkMʚl yk5{>H2!C-ElMB@p$EN,Fk&v{ؒeڟb"Vӏ}-lEo{I#7 oL !AV1B>" d"$"`DZD1C<Jr2Y%r3! /BĤ[b9JZ"VgqgH:].#YMK]]Fn$9H3r)8I, AMb^̸.. sCSLX7 M9n _)Q~\> ] ]<8W{8wF6<1(Xٲ&O:Va {3l<]AwC44ХI E[? V8_S( 6G&TrAs>XhRpCǏ[7GAd6ڞ¼1zCH`͛)PĕꭔK(/P/KFiė>*UN~h*l]l<5J&XQ8(t|hӗ|& B6R'GMf!tHLs ҈@ 2i>ں+Vٝ_- 7R o[Z6[2 Z~W;.".%]WE4[߮>\{2Jdj+tutah?~Hd .*xK7t!PHxRo%K}tTD4Eik,Dhz+-DdΖ+(]0ZKBSPZs!Q:eCYڜYeEx _ae2׮\M,tfxXd8mC⥡ ?do/w@j=_b$S'|[OR*D^ɨ16%2CXƫF%Ga9y=DH Tc(Ǐ^#v {RARX䠔KF 0!H8-!8NG,g*=X@+S#uj?/χ~ޏZ8s3~X=G!܉le`U0) ʲ Iƹ+K1Jun$$VeibJbfR\pЇBDh/hn Sd>f>Wkt=-evÔyؕg-i77MOSvYZopWAu|;{ =Ϭ3[zD{.~3+#=VsXf^ФX,F N .5ψ̪܋YBapň~VS]_Kj:gQ9O<}vqR.ݓW"]RۣHpVo+,u2&7`F|d,-,#:3=XеDIc V*!s Ȅl9$F3!|@B$S2*xȾI(B[<'R`ّe7ǯN$CF]_.ҵzwjjd[ӎDz֢EgNiJ9iИKoO6$HhyX:1TGL ŶNZ{Nv,K ʆIQXZ-sTޔWсҌ@OeXeopOB~1e.Fd], /5hyv%DG!XS,t0 b!ZWWQ8RK.s[Od& 謕R$mrŠ  xTw"P@(RJ#B7K/ȗL9sxL a5z{ҙLK? 
o?+?hx) ?]jzJy1}ğ]kpZy7g~I`)W tKD8e{0A!8 6|%݂:@DMLҤ8d'^<)^tFGY| C1 KIm#a+q-0~ZT4dz">j> }iSx;ih۟}tvqv!w4Dĝ.}kD*_5gէ;Up=-//5zۺ1ggf5+a.q ?V'㟯gmZ~Ϫ5'颫 n+ Ǔ>Q,d<]pUͽ.o׮{5Ҳ{*\Αұ&*8A1&Qm74jH'9ol8=3R?9cyç/?}O?@+ΟhfiQ8gg `3뿽 ]Cedw ռŀ?5vw%øZZE 6.ڝqz3S8 '$6My&tMtW5DhaͿxZcـH7>h7^2Զߵ?cفz:ȏē8B2GRTzH13K9KK6#* wZϣbY_"{Άqi1:luv4f{Di?i.FTTrT#ߦKVBFIgj"P^go`UM\ڧEdϩQFB)%|>)VU/g8K'ף􀃲A x0jm 뿦An˸!lwl( S*[B݃b%-et-3ƥT oBx&J-qR 75^jH)\_j;,`ZrViGdƉXU{D4X< MA4,k0z W+tYf`l11ƒȓhx=MLԉUĺ ̾ m=#K;oi7S 6Li\ҳGTREgjԀKEY 1\T@iqiwWdoz?0v7lb :ڃA=%FIg GD&$wZD-@rRXx,#.j y6!9:71!yTIAER`H9aHR\B(R&]Lq*>Vx(S)88 cfIzf)axhژ8;qp-XPa ۂY .o6}ݥ}!EZ]/^ƪ/6@cE3+7!1_1TqⳖ㳖R|֒ ZR)\u9)ZV0<, F=jFU&h4v #ͻ3odt^ot]Q+R[0Egw1yj^HMaGn LxU}TTϧP2㺸wUh;* @NTI %*ž[V'A)X0taE$XLjAFB#|IE 0&b$.,I+ٹgIJͳqgZbUX닁+W򋁫$휑tpR*2\;>zq$SJ:\%jq9|pT\'Wb}b"..T_ \2?U܌ W/W ]JKy1pR/;\%)9p%"կf8Jj Kw,ee c3ZjWYJ +/RS R{5mz :b1r)%G< u.Mp7?ß5 YIo{vUkO+ )0;fSJhD gXB$28 V~UmU'`sIw]4n/? . 6ۻW9s363O͕8w0U#YW0pd0xbQ6jƄHVk2ph0ЀQ|]ϥGٞyr"S;(r 5pLЌnzqw{h[ 8喓n>MtrY7o:d]N^\$W=`9JUr~_%WUr~_%WUr~_ǫ=U,= W&dί*9Jί*9ʙU\D.2j.ˬ2jTP* r@<˓b@Zȹ{')ʞ_g\f5{j.ˬ2j.UXj QԧTQ.? og4ZW0U= pS l = =9φeYTzf0ZCr`AȈ,ydRak"π9ȞQ:"*BT (W$E Oc*5( *H{8zo8BQ4@U ;-Ê:>饍` aU8 W QHc@'\ԼXkD N)%F\W"S-7)EgFIqȋ >7OyѼТ9'] #N+["I,cF{g6Qj* }ɥ$J3k%L<#Z_U#/ N*hVv~PqY$lPɖhg6C:t<^`Pq*+J{_ms Ѷ`Rb S緓xz;X抣GIs^HECMà_)I1EFy@-jWt\yuCQodqbKzv1g"H:5aI+)E I`.&) IZ,=i _cS7C5_R=&J_}[dnT5/i-ҚJJqP+! 
`JZ槨e^nJ;QnnNm>du*;nTI;~lz1ov_0:UH(by8P.ƙ3u^3I*.18FR'A˵.< baX 8Gf5}/`8t^{461* [Yo3FN{HG/75> ;%JoWL TxUr!dױT4:4kA8 24%GP1յb0ϥouk Cq0H^WKxabw;?C4Mg|`]~g/7͵=iotAy0L=76h.fcKP2u2D`m40QAƏ6+ἱ'19tplp`G #=\SG|r(Wl&ǟjκiKjG]s 5d\̍G#:[s5Mq뎒t| Erm۲r/6_X/JO:>>Fä{Und?\Ĺ`1X11&HkM=IVl~<{w)bt<a5 3= DɼSY_w|r3jPU~0>TS4USY:(MBxoFߖ"DԚ{D4~r}6O={g7Qy;zrV-f ξ.y߾KTdo}(ȟkJSezklJ߶tL|(KwS s"VuӟwI.FLwKU^x,`r@r"m?KM|[MlMM!f-18!'Ax~G{-N0YSM1׾xX4 3"ԧ|Kb놣O<҉ ǚЭ>X:{fv!3hw%oTI3NAiI{N'sgI&!DgV=6KBp&Qlm^kn(dQk%%h $7stfΝ9w6ЀW<]VҨL(ՎR I֌  `K 25zI])Fa08ЍƁE磅( }Lmgj{ȑ_[kd(`q.HbqoŢHׂeHy)xdk$yDY- X{zj<Ūpw(ò?kr0y<#%;4-0 ;?X2P2WIyl`;\{JekNtjOk@(rNAeA-N5jAT y*:3(@E)aܗyBs8&aaqcM|mo 8wC'ļv_NH'=ɃĔܹ[S~χl?#niU}MNr7_/B1BцʕL4-K`!&=8[C-w:YlgG'c1(Of Y{ۄU%Tx'Mo}GEYm32Պ#ɱ"s@8Ѯ~lýEQN`Ka4qNYJնhL$o1eH 0h쓍n6&nJ~bwk/ݣ-ٵ̎UDhl{(#xQ{t!^#F A|y:'mw\A`sк RM|VTqqzK'|h!>aVE*ŢrÞ+g5i@Zt)bZɹj q~nqxh$˩_79@5>4y䪬R|V˲\7s.!"XuN&/1G2{5pͣylvju1 F|jSm3(@1w@lw,&OSlbц&zF}WU=V 0D1{k "DtۙZ btgMqj>ZJlV;ڧ o'oߚV諷5SL.ɨ-S(J5)Pb+|aX jl!apBظ>ZzsqC)Y ~RNJzk$bF@.*H"I쀋+ؔPyaظ}UξƢS>"jLBJ" FaT]bi@Ӓ`cX~8s4ğDdF R$/\a/Bޕd<؄ls:#z]}hnI'J>vO:G*<p:ds>E 4$W <0W_Zde' sNy·%*VC %Srj_Jr@vMfb8lPaBkI)BJ-TS .V[9QM{[]+v0k>8LNtÛNk5 Ȑ\.F.Eu V j*dlI0Q|GJPs*q0vhU42Ԫtӓ!E·'\ɺStr좸zDqؽeQ=>?\& w(or&A(ZW)f(֏J\㭫S -- uAC-%`-gB1Bb{Z \$TIĹsnX_Uk/8UV}tq߽jxvݙe<]:==z=2 o"gC'G<$N+ZАNE}+Z Q1 R2~mY|%XBn0V#F_3WvĹcC1⵻iǡ^sӳמ4+\^XZJ E:j:4$]u!Pl5w&bu>PN0dA ZA|Maa p%/P*Ts;Ps?IosSPh:{D3{#޺J{GQU: $Ɨ l#hJ:!$. #jHYбkj3=,c¤ @AlR]M=W,K/.7K&/mg_CUc`cLTA#). 
LERz|0^q?1k5dlj3ϸbzG\DՏ8|ڬِà E X Po\He '{߽+ѧ3÷xbxrjqu69/fSm|HO[]{Vk-`&XW-WИ S_ސ5LQeVHTU܋h㭾DItiFrRT_yO_E[&D)Cebel{(5d\FE@RP&.B./)fCo$\'A_ۗhg DFHϩ/m7U o<&+<毜/J[~ׄ]Íq^?xjNOʏU[q"/Ň \ϭP&.x=HZY6PaRv(W OY$G;5q`  #dZ&$ʱ,t}uܾU c[04?^1<6Y*$ﹼ8[|O?ys~uA->mfBqb+w/wgg\ϷoO錇l~7ݻ[O Ld[tbqPPԄs0qLy7ޱGI{(K3 &k d TjNcZNJ>=}ے?Z>Ey}[+>(ɑЍݤɞ_׃Սy.yz֯WP$9VkLE[mP)bul$ɖQbt۔R[Rlઊ.JE G wdFI;%5eوsoQ׺n:v0ꣽvs|6aS|֦wJQ=eLݏ̩IycLݹfQ6m>}apl5<8e@\!ck`rCb!y,:gzq=1A Q;0Dd~HetaT=VN;ۤPfʡoc ։ʦQ(Ҥ+$r ,抾D'w^!M :fhI\2FFv樓51EyF >Sg{㌎zQ9[ zM~m#QRsJl9INȃdI!J0%PŴ ewBJM};x8 LRJcjw:鈈5wگ8wC5nAˡ=L7鹯P&jb>Y!+ 1<'%;ZrrKJ&Gg@O3pw$Ʃ!3B5XRdbF}f hP?9;УcS'A=^:q7x;d<=IGݛ/%' rҰs{,/9Q,,Kaphp2=4ℱuofbKz;^t^sC9=?!Q{!Q7t=!x8IKloIJ0Uݐ `Pm_VgX;V~B~fY{v&: &2V'XЮGS V~#W z;;hϲ\"]IPCǵѡ`T##Q{T.P{jescG~ ۟Nʍ|&x6!8z5P[u9bhtY3!, XCI}eNs}ZCne8]cjmK^{1(&"sPXf܀o:4ǐ|I;G1(hH*`9`bHZyR!5CjEu߬ 3)"Ch(jt.M̤,p-f[ڙL˷_l,N[[ GrG{S b 8n} //["++ce,[R[A_ro釗Kqo^i6WLVV4e81.dozrN_e_k z?f^[xƸl?l~-R>>>|Z :-xն:K>rsϿZx V;_Jī6u׏0}7ſ,}ً-&)8g{8WMn=Rqm9B?-%KR~d{/CRP" tWTUדl1c9#|eETvp?^OrZ~Ž ,+ҕ#18Gj4h0և 'Xi/?Oz<w՛ƍ~$Fm:N^Sa*#i``RP b)>Šr۝rS =+^.Ɍ$gǫ# 8=C3_y &US 9%Є3O)'p%_דeu ٽO3k06T' _ 1!w}=7Wև &c2A_ݚl6/ < W340|[$Ǖmb7jg)v@-OTzl_3큽e$"_j.S4aﮯ5┱#@0h0F"cZuOc-?.ſܹ0le&eUB2Q Y2N w>g\>w@s.%] >b3 )0;fSJhD` )!\݁W7{_y=~U2?؛\/Gf|`qN Ƃ1 >vvFj_4W0׿?۶|6C܃zo%vʻwҮmL֜'*糧'=q%pVv#mOfkb"*it @X mlU,ͯEXpKq E:/B@9(߆Ez޽ok}h_)94pq+n&KLMy -wh<Ƙǘ`j!P -lK5n]ĺ y;@*澪lrM!/ZP sZD)Z2li%%_+Lw*8ЃvmYŻ0:@E E=N4V)ã)pRTW]7G5Ni1pp>@ ^S+v }b[;8,81M]2㣹dJJt,L}; %ӗs(RDܞvU+ m%o -w|i'"\y{hոbכ1ʽQ8P•'E1s/сaN4 /#ckU vh5ZkM̨3hQ9b0XDNPB2-Kpkl8 h0ĦƙR~т ɝ.8QFfKV^ˈĝ 6!9:4rۓ -훐3H =u)' [Te"X*ä)5O =nhSPt1fIzf)aY5r6Z/U( rj`Vc¥ G?pӾ{ЇEc[+ `*_EG« I! X]&K5:]Xd`JFuϫ "6Fg.PftYf!GBZRʅ,Wm#-2LQVYTV9▱;^zc``T2`K.-rPדA]W;S:eXI\(wp)Bp jg>LLj9+j$Z``VچWC=i=>I,m 3=/ mmwPRzjWח_ 02:^jKH,-qloFG܇Qy(w;/"H)&# 2jVV:iJ j|jxGe80/}w2BHRVa $YX ڐL"^ˈQbhnD|9[u)AVQAv8d4 aXߙW“"njμ6.FPoYؕ7J%_3\g{}7~~[g`N htO攣` NrOznyƎ('KF8)5bτS2@ kBhʢH罨X=մqof`T^zo|1) OAʇ{z-C>KUO8L*C K2$MMDw锵F(8.nd< Ϋ{zU+p@Ұ^<& -Keۧt́_UП>? 
،PlU S[t8IJzeV y}X!hazA~5kI'\d֬:^{bkN?KhZUr:YsٳW3w-KhYܺ}Vƛ;#yz~HM?yij|Cǵ;<hχ< ΎfXs:i͡O[hS.yH?66=D8Fr9Lw8'mIp'm[d6}s/M\}YoΈ5#9,Erc(AxAFG jJ93]})ț!8ݎ(RiBݠʶ7mr؉G=7.ӎM&!֒oN{ !xLjb9!Z5Ü<#SQih5?b8>(d@82Aq)Ŵ͸)B+I,YyϠsO1ALIY,-,x3k;Eoklj4j 6Oa H]lqOo(WTC|0ßL>18P=Da`s\H$r ,*!% ZPoa+&NxɅTO4SxRj##jOށ* 'F*`(#EV* RsNN[Яeh 6CC(j۲N]|'%ԆO\7<Ξo-N]Bwх(`ūz|~s-LNI̓L^'rZv>>|Ze\yUY 5URP?>M*pUgZsiPZO O ޗ;&BS_&3יo/A'J L?hӳyynїjsӣ~u?ŕmCaU '˞etl/^}[?Ww_ۆqj_\yqQwo GD@E'f?9&I^a^8IOa<=\}ﴮ[&3z,;UtO&Y^mo8 {o4{2ؤdJeϴ;N* }"fדz;g;]lS.K 8J B)I[ƽAA[ךްdRk%%h ä;qi||`BytMUxNN^Aa ZEFgB v2lLb&f\x-U`)V^@ߕ0S5w>n7іG!g⑮9{5VE\xcoJ)&yRHфj<ϯbWQ蕰ųkuZ2揞^X*zxڌ\GXc>ߏ`NO?j_ȕi}dgL@(K&$gy^"g49Y y: ?Y! &kBxJBQmOȳey>=gİAe0Қ)"~sN:`O*+,3,[Azƪtb)7r/'pc4lv8|<2˳i?eeQ I:,UdHK#! BLi@a4R)#-5)O5fXsD1$\10d7hB#`CHZ鄒1&9D+T4h1Pc`rfbKFf9:(dwra{򆫿Dhϕl mWTrj*^?kG^ruGiSuU,^iHF"2$RYVLf\4H *1$5=t-Ip'g:XaNi wmqF#KA6 vd<E:ǰ,9俧ؚK>i<݀!K=35ub Z:+X=T]mP(hL6@,جrH\ƪ"Ei'ī# ZQ9aPv6LlO߁QԾ; li~gi`"` S԰9Jtޘ*F簘PpHߠ+G~fۨ=o5gY,|?䌉KbęT#`ZàVSFHJsp1qV%(سf&{D%o#]4!Sr3cOȚi*G/K6^@{^nJn{u{8&&_lVoxQQ:dD&$CSl0Xҡs8%){Cd G 4c֩o-*Z3\+5.)S[ ")CE[o/MUog1cpJ$q0qƩBct0]RwDAJ=c`%TpgD%Qb@T1 cb[@dZO<B< 9) ,-e+-GsQ8",pKicUMFlԒZuA#Cpn桨`S?iB!Tz^k,+ Ejm w!STtHnЬwaI]͊1]|z).Q0JH(feqޫ*cIK)cAϳְ!jL2L5MM,ƢƠ"qEl U/IyF֟QlJ]m0ecas9yW3jL-br}_ɅF)sSH1;I7RinX8LA;I} s{3KbsJ1 OP<:}8?e@z28y& 85.oC۞ ~By;u_@!uoiǜώItP+muǿ-Bń zkJ l M`yM~g:AN;i9{ni ;Sw po?K6aZ 9j9uG1nk8̋1zlW9̬}ˆ<u+|g7ͻ=}|DX<)e: Fb0qj`Tj*Qڊ/fxe^T ++B< V6XA&K-ǢGSRd"@52V&"PG 31  oJlXPBR( Aiqӽ.t[UWQ aP݈rEKJR )m B{>sCnj.hThl?]FߨϰQ-@${䪭IrQ}G d6ayR> h^>&O:<6WY{?Yurm|pN󔫋jo!?8~+WKpE(֊QZERF~4`h|&u!fc "&iMJ1j XD+$HmOS")k'Q'53g ðʸvZ4cO_ f>~3#-isrbͿ`N\\~8/ʛ(qDI-6yIVn"E1`rAކN98e6!#\nI:p&P0kn{ PbT&Xr&P]G>'jRLͧ(I43JjQ!+2^RNk;Q(=@Ih*D[_a8BdwǾ nGիDjhD)=[ef#V-fFJzČ֘V٬&NTۓ9dOL(L m Eq){|s=k,Sw0nB 8GW3Y۔f ?u{_g]ʧH5/W:; =`. 
?C]rSTAvYByYN:E데)+eb"N:!DuE!R^Pc{`t :.^'6@CN1c㯰!)[pJ'p 9S^ "0@@@flTn &8t`k@Č#|8 3~cl_?;`M߅4Ӊh,H.z!E 6*R O'#1%QG( Y"Ն1 7"F̨ho+A*ODٝ1 X2'zCIxg kmĜ6J#I&Za AOCF;  'tHtquhR:8lkWQnmtʳFZR6TbLQəJ +هpS8zc#Vi/4"zk1wKgoWəUp`L+-|D5A 0peޛ7w Kd@Ӹ?ǘY#y(aF]p϶J)~,>~3^~06XXQROyP硏ESS0@IDn9!1vLd4U(GgpJs(LI/TG84^hG|`"y@$E Ĭ1 钌vz%^*IvX؈Ir}I/9 %_} 5Dz#J{(xpsh:3%NG `0s֣ #X "%3fZFbtV䂞jW-皞lOq!8?Pg?Xel  B/MQu Aj#QP'?dmIB/{̒ !`da%%D) })#n#)Jf^bTw]]WWU-.EL4!CPr.V*FL(]'z%+es wNChs`1 ss?øs%5zT|m齄NP|=":h?PN7ETB/ f:RoaEt`rt+b2(d8sHJCB~~xwq$4vA&sQ +`FwqjAW??)'^^Ca}\p{t|5iz#^^'w//'P.~Un~|O&w^iKO^^eG s}W~x,θlf*4GƨY-&egu:;`U-+!ȧD\p8 ̴p~0o{ǵ|\{ǵ|9Y #$Vs.A`Sat$֕!ȐYSR Z'֍ࣁP*ec͔/ğlƵj\?f}7z1x3 %#بp$2@m]bSkq]K9 EE^u-e_ TzQiWFQ[ɨd !AqDh!#+Yg6YHT I+iry6>|O}|c=b2;=5Cıͷ*j]3KDz3{h?~5~uStqx1]czc}>&提ûGp7 ~k6[7ł&ڭ`_>1r Xd] 6|98k0fv9=B  /UZseQ:kf93Nst`o׼!%Yt`bc(4,7ZL;`S3D EImt#&Q9SBJzW5݉fL9 $g5OFaPzkG&Qd5V%A*s^҈8ė\da Ά6eksAJkE zR6+Y5Bf#_ԬdRj&.FF `:ZM;.XuAtݻݓ̻V9dZ7hd:\Ej㒉(Er^4-3hjB%ռ*IVMŲ5bTuѼ"Ѵh=+רFL:561CB3wئd$Jl]d1E4y^͙(~i U7[5dL߰npVm+2 )mQ8VjBݬmk a}ӹhJrq{Li/q~9&&^1rw]&v1T uЬq 4^E)tM5$? Ж0SmxSw|vY]4?M{dOCe˄$VW;_ɘFu2++@崏i_b'=!xZ{~z:YsEAɸ޿ dK pmԷm3[/xmQxKAn} *e$5ZG#AoCRq<Ɠfدӓ׿xYe|~T~1Ie!qN%W?]klvcՆVq~:ۓ^iv!vUr:! 
FP!eocepq&|ܟW[_*y9:]e djzM҆YC}Zb n *d3TkaGlŏJ#OFT!H/$G2mH6am3=%z^1w-p-[b $b5MN/xQ-QF%6[ơ8n iZN9%75:>%Xrn~9[Z-Li`MVI"|ڈ"ҍ?hVY|sa;YI-u`ՁWf\qׁ8<AB.D(ٳT6Ɉ\2l!hhK~nԫjօ#^* \>U, 5{\q77/PB%2XDh>JhHJ k!IpHӬ{nO;|F>MPfOKf'U(H9JٹlI{6쩩zQ2cs >-[R`M皥n*Dk#RJMU"kc*D툁SzP޼FJòLRLĚ'2)jHN3QXLQcXsE^ H/IФe"}L ty 0) $)86|tZRzT 9^fqLQW{~MתnOZlFnGQ ޏ$`Tf%J$vv9Ym}㓽J~F<]fŖ؜Pc _ ![*}Aظa,,ȟ:6,u*ҧAP9: A^ﴲݒvu{!t |y# h.:mV4^eDUy%!.xT+i:vKE!G|e5#~̬5j5HÖ%x3Ddb?4z:lP7l}vFtV|禊 .z:*`B"$:~ g=H:sJ)p:Q)(kՑÃ.kf#bgfU~Omg߇0I)uNP4MKVi$ezeN֮2Sck'@Kس~fRmq^]t'Cy{\do{KM {U|:{UN輀$'eABչ{LWR>UCc3i$+iᴼ;Sy,z/>)jAy7\IITkJN"&TcSH2X-yF oG/OS~)zp}-Pd& W?: ՗4xƔ3A%#T(3hDs,ƨ@ZG$튢",AHhK=c~#~:LpY50r3cg|85YU%)uӅuWsn9Rq50-ѻOg}𘣗/tV[YI7 bc2"!E2\ǭ[=%Y9nYML*mo^X9LwD:?ڐlCҧ/m'WGybr'3WG|[e6Jokgw ,UcIciN^ >~gJW7\o~kvnANloծ_F>H~2+.?]$qMHT8oQ8qIώ/9:8\CW/R}Pg# mYi1yYLEdcRD,.*iS~:XУn8]+v&v<=V>b <;Լ}+<*<*<*ZSK#4 FZ-cIV6S1!Km ,s,QR]G p K"Fa+qV/5dx 6 d[ G{dI#KeՇ$m5$~bX5_ړX fdco &rǔc2P-2_k RpLRwz_~}?شMֺ>Β z?rEH랛"Y|>};?῟~'o-͒ix_;\C-~o9| HHWo Rݟ/\iQ¹(_KGoМ_z>>EOcgw ?t*Θvxo:\/kr)q+٫&񇾛%g5!N#|;WA48K&r}XgG`9k6PAx^j*tH:;Q+v0gBN-ws ڪYp4Քz89 &tQdjqqЖ(&b>\$bj1qMeb>+hKH/Ҋ,NB4$9k6Eӊ,pR#nݑT \QIiZ>/З1P{ucD}v{a3Zpތw*m~oXv5"puq__RzzR3[8$p&Mo5|ELAKIrF4JdntbgV^5;N*R/yP~H (t_t ר޺j`rC{Zۢ/gUY3!ȬxWJ0+А32-Y1U:EAqWO9YW0<8#Zr0ʹ}2 O i>G[t {1֤;ͫf?.Zm57 *CWn蠐Tt}O:~RsxWeh?r@N"]?iMy [8`6) ps8Jsuj=BjW/~B(3C)U{$-aБ.SvטASփ65)r?7a>?{cbrLˠ@Kẓ (G/PFۻId+?dLIʈT5Ü+A ֈ u:ˠsH^% omš$i=ܔ)eT~k.[l2tb Vt@)`34fH! 
9V΁̮ RVs@9׶IHWN}2~ݬ367@5hS EA7B`}zzb=#Xɗ^T)Q^q'm.=+&@ÐX F[ AbNdcX澖ڍ[gl:+zkMYl=ccJh?pFkѐmZE;5rܘ25=EbVSigR@0 2ϹI fV6p|QXJ OT]XDG=xwfSVـH$<n9BQ8t;nATo扣17&rSO%l$޻ S^IZ]iAYc\3i~D<92j^Bϐ imCkp-zѽl>8e:o~>Nnt^F+ !H_h٫fiL] kpK)IdE&3Xe7PVBB4FǪ"_)\U Ƃ(zw/^.RZëYklIP|1R&DuZ  I5 B !QE0JS)nn](vbvS+,F0Lp-~WgY1>sAtJݟ*{͞Vr+4 {<|b}}E>{yy8o뫫U$Jjuu?\$h!X/b I=ϢScG&L]$/UzWQS}fŀ[y'˾k_ffm쨔fO{z庮Su`>'g/)_; QӈM-I%1q`$+KJO/aRqiYЀ-h7>̼r5@Ma^o:W}]󖈼8KdM܉fa;j+P0̴+3< 'Be{Xdg5l?aa=($tXO]NUnr^|6Î=&i t:uUNx|aͿmYqYiQm)GYe"F!\.15dT|Y>R2sVR{,1C)lU)]l<.%)KMexq+U}/+>)#BW;ͧK]_5Kz_zG{5'7<4 D0&WsB{+Mv-$>~/g9<^[c7N s9ľ+5hκ|S!+5%/5Z/xa"A҈xoT0 I + ? l>*1APp%7v{8\z`]JIc2ʅ@_L{(i|c>+4هw[7W}Jр )ÉwǰOo?KWRHbVJ4h&$gKdEZt)1[[Q`.(%w MwJBipsh!k~}H:Z(q*Xpe?j]v4:m2(Bλ;/>P"Z ]WW+;;)]a3+MaUCv\/>QQ o!-d9Yg;jxyopyyIF}Y^𮎺y$!У{24Ձ/dk4V/L«Nyeյp]0ܒ@Cw++q-@$}tTx z(UVp\#pvpH;c6߇[E #ğ;2&j ]0Gh=i\+|Q:pKCC e)HERYI彧~_W3[T㚉q $[1d>UMy71Zae(,Q 8 q4(Njk[.ңN(G BJZ(r0 1p1~KMP/ 릷LA/%5^ zuLj;P M hOahZM զ@F c\Է5 z2Hwm͍ʩwUCʻqr|;/N0*DZ{?.Û!fԮ-{%5@U2@곺V+aɜ9H2aXPUp-T=IIvQ0f.f^?~^xvvz+k@%SP  Bŀb׃|{C.846˸^l0m޶i1 A)X" cqs#;{\s{"h宂(:\I1൷5FN\N?M{KG bgzM[Bx޼r#HnnmnAғ2G6sZ'^]lz2v2?odwd[Fff gqs7cNU:;Lr57GfW ){'?I18X. GWl}jڠȊxp@C& 3zSj1puxK!.~mڭ9Rt_/ 2oH tx/eV %JFKC@/*ˀPLsʂsŃߏ]2UÒu8EA:!\>^w=hs;5-4Þ;f-]Jûa;km`k' aUp4 Pt8Jycy8[?J#A'(ݿB]f޺J!Zf+ jtx3J4~iV@P9:4-Onz -<:3jp3(R﹒JPI.N?j}mR=[ zMղ" @Wihգ]tjO -t_o~Z뚐k$N^Ry!mcN/g{.gWK7.!o~jteqNqh'"F}FDzUtrwFɷa>|~}vzJAMKPӅHNGsw=Z~>߸ n\ G7!&i~,Nր=y pR2f*>wdPiW?jq26缣qe<+m![ySWnA,I~P%5=~1-/~>uQݣh[/:y"p~1 D4CzJI5By.}򃬇%?h\ރA{К086쓭?98}]8 8X\|ҸyЇ)82"Yd@pyN; 79.xtM00\7-$[KF1fW95>\SR)u6ZYg g+;8{EB,=pU@G# yWPL9(:n f-[J;oE(zyZ`H^ AtQ|@LWʍ=?wCF 8.ޏN kd?%?L\B g?uJ1GCSRTÉ9) Bdag7<1GuJ˻͡BIO.͗ q>b{I7*ž_ߥ[Q >o]dUSoMc?Kש\0tG9tXH@ꨤ!j|q$G|KR#J Hzĸҽtk3 ]͋q=ѣ FרHexU8<RE% Q8DVh+ġzhГrŤZ k>V߭ǐwmIxل~9%f,_+23v!!_ȔJjltWǞBa??i{3W<_-_}Ig)ccv*S1+ƋW>zUtؔl0bZRz r]=7Bs,J22z7R] <g'_WYT}$'|gsa(y8({k^ߐ rIW{hU|̑ aq2ΔG53A^&ETk5w^ z/"e#<0€).mPREFSƀV-.%vJ»Ob{ Ҍ/!c01+c VR# I4pZQ5!U\%=;)܍̴a1Ȧ]ZX88_uO8JQ|gmv×dvрs jjHU׳oKfp}M2GWK}]^+]x u3R! 
ӈ}-̔Y%Ȍ5^ 3u[5jjuG6vW ܇!qX9wz(Hr@ .8U,.2:F'/x-$XƭB˜?14וvjȧrA =(M%k:@W::-m|X*dOӾJ51/+cP4?TX#<%]KϻxFFT&S?j~0\䪜'W<*MWb65b3[( *i:!g>2T@ Pi4kZgwF]3c9}NJud/NcX ō-*]#GsLH:Ў^mgY`{į,=ϒ ~ӗ%lWRF9~_>ӜA^Vqf=І:]@Fڣpho$yy|< y07`+t|H#JB}%WP F:+m+m|oTJK/%}s-gZ۸ҧ#4NlJ%Ml D8W߯A҈ P̔SEu_ht/_KC+&<ښhF%VՕAE^.Y r1 6\kl\ʆHJK%tZ% ,2hir)^ Z4NmS T9{X c rݸ&Me4)~](ƾkfRj^}' %H3-U1׋Oq&yoяgŻW m 9e54e?H}fl3V O\C=fNT,ӓs}۪)=H+<kt|'W*Z?h<d{}:fs1XZc++;f _$fGN,dV I9!kTGkkĆvɃIF8f](׬V^K+3[ (W3ԖVflٿciVv&JQdeЮx%Fԕμ.|駱]5+7"bkVT7FaXH=8<[\@Pw\ѯzt=CǙ=z8'H*;KE)pȻeʼ4Sٮ75m#pi?Vۜ5us(iiY&5f#$#Ŀt!6>ݨ*}bg)R4cjnvc=)wzSa7Q#,4";!uKGE)D!vZR];?{x 4v!31a/M5dj{89]]jz{|| H*vn7.o[.ǀ#6Y# n/s~$>C7+ۯxF$LIW0 w1֌Pi[VYJF-&sQơ `WδPz ZC - (Qq;~LLG=D$9nH!mchl_t ȕ 8j_QU+} b#}ZYr ǸZ rc^XF G`In`cj$Q4z7\mR*::DhwLK78Єocbt+qw =?P1WCcv|̣Q (+2##ʍi[lSr"E6b.&Oxȁo*mxNj3:c}ۂgy[j<4Q\Z@ :2s[r%s9m~ى3qGn Pv"Cw=zK%2T Lk6E҄ܞBAGaO)Krs3*|q8MN /}K7Fkcmn=AJTR#CyEvMx~\% ol,0oBd ὶ&;ahN<>;)wRI{;81m2Fj$Q&%J$lcqZzj䤌hY ^HSBm;8}QN}_5#l]{ug |FS?5)S?5iOٺ\b2is c9BvikpR(+ƫٺzuߗ`kr|}@oV0VC8*SnYj_9-%os.;Fu\xȵ@Rb#alćVvTceirQ,XuG#[eH Ԛb& |&بhRbewB6x&7)O2vVGvy NH&*LXVc*iI!DΒA kCPȫ̰f݂4Z}1[}Cw|,3C0Pk}̢z*G?'B|=~fmGi6'4RVrՌQ# s׌Ri>X>8f!t,wutĖe!jy"X ;g\qR8)D"ND|ж^hM4b} HM#eM(E}/׶˷fg}h}hx5ӓV@6\Ўp!F4`DW}.bƶsK2YRCt`sCTIgR@<2Ƌ醙Ĭ>"ЭBCK^I(^kvع1o`0BwʘX*u^TĴV.= 3CJw oAׯ(4 Gvo\E&rƆĻ/Wi`:*{eQ7W1} h볶8 ݟ=#͋ (xVEl|ni%%eF'S,̖2!hAv]PbpS#0N$+$*`b8P;UHB;]۠V1&@V:p eSժBNgZ<#3SdE'56g9k,ڍXm.3`:SBedjLiBJ$ʬY@dGɄ1=A!F9{yeX' - 91YdT<#QDc!n @>gسBQzl;S6n94 2TWci]Y.0]rѢvq1A'2F%V7jkR$XA,$f1#6m$SZ`H%A[bx^w[ϼE@̔ V訴(ߘe%贞eG& -KdL`'fp|oAKǑ:Y4J v`=3!3 +`. h'!sL'_wLáy{V++gE[֧}`-8H`XNqa[JFX]lȲ&-*v  7YHazg ?%M ;{_Naţ=C,Vy^*9(FD!;TR3B刞Fޱȫ.?S Z,/g7O!를-GW_Uok~WxDzc,,(ϤaxZ閽v„zRLI1'mmW%yV(O?f@вA5VՑRc)(][s3Kkh_x鬍6b,J+KMkCd?]^@J~]6w[vvL~ 2*}I.Vo߶Ko;\T}/o//i?=TNd>?)P;T۽lmY_ՙ~T}Y+T+;`MϬNoOfSn8Cٹ Fe`!ׁ21*4. 
ʝL1?{ah 1\m&1v觉2ފf_֨QF3[B9P͵ݾ68 "0z]TPI`'QVgrzr,W݃0ٻ޸dW88 Ag%YJdKήSi8Ԍd< edץ1 u{q@B+[Q9坆 itQiL߾~_A)GV#NVP%y)g-gQ߯';?4t92%b|zN~NIX7֣,u hty(joFae|+ OWK?ݤ{__ӡ_FćvsKe UN=@Ow%PFOoǿaɚoүn)L۫P=E<񗝯ośӳե=[},q!+wvKxf!O-x&âvZ?)2%a)0H #QmLϋE{?ַj5&ӻƅ/?,m9%7K}!|XCO|DAh#_)t[JMPR}Jӝ? %)ҙٜIfuD7 ڡ:f@)T$lw;.5ns>lu>u#?DOe P;Y/,`\ zhm_=Ci"3^Il6ܣ$>ЯBI)?B$n`yH$B29xWyL#Z?B> BPc"(hp@yC%Q0,86M!UkeTECEAH-G(Mҁ C@K -SHfC>]-Ox[E40ml)9c?i˕DP9m*?2Ҁ 4peRDf*4l27<{ǟJݛ9@JB&?V˒oZtWa}ңhUiޅ(s2ScYuJ9r6fEM!C=P~ !!|{NKc =IW/,-?D5y 41K^ÜkBWU0ꈑ6V Q_+_ :iG&<,h*ս2" xcU'4 X+<.CA,ų)ò/'' PW) l$i"v^._RˉW*zT印Z^MWb9_0j;b["% +?=HA @O% zfr_Gɜ\omLd_eB[wyvh3OSXG8ߦSh)=nt_Mh,*x)FP9@J࣏襆;K~(aJd=kOMx ]2jTDbQTuQ 4ndKw=eܾFtheȖI,H >f҈pT/fȱ[GM& yOc_ˍk݉I | .|r1o։V8/-H̵$[p#Cb&DKXDݮ[}P*@P;7$WOX7🔇]_}@7b}>ޔ_78'McjL1ᄡ8m(MƑa *7wwW%>[&&JT̘c|ƹ3-5EYl7=Y3T*l-c0Pਕ2F]B:m8Af pI1;TفÀTϗ„rSݓ^Jk`1, :A/]k"8R3w7vZ;#T7ogR`Xv @SADCݮ)512RR(xŨR*Q踪4!#꤫bNTjn<*Y|B'4 Rd^h  ȹ"eh @͞}PMzN, PB+{ 1\x FtdPrCըu8Hj6k4,NaO.XC9U1dOU.?=PIkiY6F/ioKZOO.DAo&-S&b{{ؑFFRII螾[j7&(B0O! ^=iݑ9BAR8evg~^εVuNʊtA Nl@e>D@m%w 6: :uW-Љ2aZm$vda!7+o׵&ZS!{i jaq+-AAhQ%T5-hQa 1dD@u$&AFZ`$d*##6si4jR !P(c ;ZoQ4 _#1 /j4'{\jbBlLM'űnGkM>nw ݘ49bv}c…,w<.IsgAi'HЯЙ(zDEĬUc T"-Ϻa Բȟ6C'P.A޾9q׷)h߯נt>9]KKuhs1~>=o_ex (e<_0$[UEF"|'`[Ok۰a_ r?_}~o:K#,U)tâ!e@hNԏp֭9S.mWiklVLhuBCr-)#GP"v+Ac,Fm}R Yշ L|^Sa񌜓92( >W10[T눆E t-]~=CGh5( =%1ybA1{YP2EBaB-biܢ[,% EBlBN4ve)J| S)MNЧhBg `X}!H` {#1E}Gj5tiߟ0. AI)m :Y0Z6Kr:$Srˇz[B,q/ΒorIr%c# 9ŤXb:šdN4h t1jt$ꟍĴ|zrIn_HL%jȱ%7SoOm23JF5PC8;ݙ̙.\C0M$vdd xy'']K dhlP\<\~L蛻uZ䮕")8U#@;#WLeeTΩȁZ6o2k"\|^N>طJָd crYVK֘jԯz4H#+;9xvv8oj^CzF +Oe uT04hK NɺFR) [BGV)f 3d×_M+KP;J鄵=D#%WܤuGCԨ &ldsW2hvTyS}r8<3EZq-(7:{k646\FP zGCRdzφ=1dC-FI>IFeU3/":8p=I_y } Gzq9. HJݧt=4D5zU}tOK" D]t 0jA9m);ɠgksK1Rgzȑ_)ia_;`]LϢF̔JU*KnodI%bfY:'TG/ŏ>OMSqU`<,eqˈfOv_b$w,-dOfBܺܯ$f N,hӲʜ4*HN8"0ϭQR.?N, ,..n3ke! 
'T"3%PtsKGXqKLJD%qcwX(Xtm@T FfwQjǞc I9l@/@FKG 4d8(m k{p3+FϞ&}A8e.9%xa3~'m ֕hMK+˪H'QP c7vM%V'8݁{}qS GbQB{bpi S؄oYEѡ xq :P61흪 aOM8e.^9<%N8' sQB,YPY[OEnv:jh `;9r@645y= 蹬ufUq߄ g8#L%1)4`Qcvwa$![l #ouHB^K".Y;|>B)(i; Cb2D-`mkY65MXC_uN-Յ||sNr\le&BF$Duh+y#kx,f6 \f"T/Ddާ|pt`L U4 Uڱh O1nouC 3؄oYR ^aޔLR ETBN#2G)ӏ*Khr;NV19<9M^8j-,Ώ6(51D{abF-'јW4 (y }04;3B=X${7N^*Y[r0sNA /rtZUuueŻշ/ϊDeb$eA}qu#Mn[ɰ\DA(/isR)o-Fh=]a9b8b,7krtqZ &BSh~PhܦD_U{ʧߎb[}f 3~.էS]vCfVrvsK+f+v\18yߢzL7VWYj<@jUgv6K8/-Ch~4k3K8_8?5Hi@]5[ 9?>ǫf=W3 ;{s>`;Edg{bSpھuBMC#[Fwv6>qgp;,6l\u2 9%Y `ݬaY]̟'iZϐaG/ه6X$N%1u9.V 9nX,k DƥR0`QT.$pCYnUۜ\n݈l4 차'r?"-)شWHb$ͻ4dlcQG`/mT~z]Mf"020`Qq^t'X< Ϧp 6\4೽I.Ut@M{2,TpGٍf8S}y$L&aƜGsl`[$K{V?m,#0oꕲZF+4(%lِUf`Վ-='`t?i)~yq39݆>(rHeu'kYdyɭDnwxأn\#30*%)l #4>x=ZIvTvV$LbXڟ.U\{ eJ>"bS CCk)y| X{O#n7Sü35Sp~b {aaK3$ʃ._-0Kja3:g{8J2.Dx\3.7a@fW gu= orI ZTTǒ)P/P6+5,)m pF:";e\n{{ }!w$M1$сJy٫Ob0D%;y'97tc}"V?Ws+44m/i)3s}zƘ7hyίf-XepZ.6N#z//G`w! I%2wH!V"e1Y),ӡD#s^|>#eѡӑܝ^ q񏯘z[@aX%e^hCkuveQhS&A-m0RHȎ"!.LaaIaLnc|1! M4ot@M︇^q+{,pS(-cZ[S+WpO^E2ZLAʔ4@#wwrG}JKb`F@-B:^,JF2}z=}`y 6sS*ہokl;l:7*ݝ#YtHmi,ڇ9p);Α`cUGwZu"X@]UW/./?"f{|;u+9LBaB).f)1t"wkɮg!wz$ݕRwAwrt6|-e-l:~<>cJӴ.= c{y;D<Yi3W}?ňҭi7ZBPn,g7XF]+n5Vbycy-X0f$hfqq%4-5HXcm+76CNydSLNTa paT;2-p6vWCz5z lNC:rd[߲ηs]k1'x0 ^,mf'4/?3%I4n1Ewgt`R[t(ٴ͟FDKl\u(m$J#YOٲ$Wk%1`#RƺCzs"Obrt@{2#8)'eݎy̆fQ`tDc:]IPuf֘6l/PR+˅*(N톦>Rc2a29^ڙی'&HZUyxUE!lۆlZӼ1^50L>57#4ng-n7םwˠN>,­m;nos}6׶XRujqLG+ɬ+bhBRmn0o*65Z+WtQB%^TSݐv-P%+Q۷PVΨwWVV&dfKQ.t{)._ɺ@'7 SH{t˜>5suj8Dz1meNa^NԧvHզtN+#~[J(l]#?m'"z^4Sؚ遪N= =䁢?ZӘO1(7FOC#pU J|}O2 َHd t{juטCEG֬ |i0p #vRXzN|>_/ z9ʲf0RnZf睶 [>3wB]WYE6?{)~vc:;vU&ݸh}?s5(s8`Eai[ys(`Amow]xqzow۝o6^hp{CE_j̓WnAtwcWo7M]z#Tտ/Ngݹ,bSI [sLt1d=GBM?Ugo\WSܸJ?'F9XV-V\};l:/@3')Ϲ PTzco I1SHlk9t.:s|-7p-OjU*ZN}*MztK|q>}UVW k۫oS1^Qʯ'N >?Ly{Eu+妹0,n#0~m} *wxQH,i4N~ [t(ͳ N1.LoBcru{e۝\ëڛ<w隡{w( >|~5'}/loC7$EH;WԍWؼ ijǃy Swy3ץƛuu86.n\vF޶[vzn %uʅ)i6_44Z?}[K}q^< ^\ ; ֵṘ8s~[7~a |\w Ca 'Yk͐W_U Bb*Uy_]߮Ks #,ȿnUX‚`W'!=텷L3}F>ș0RQ&cBYGygl@&}rCs e4ZFS'SD90U2Gf,Fxi!r$ +v#a֞12P`\Ts WθUR+UR+Upe j\}O>̌Xdɨőf 
D9cD,J_4NwJ납FSZy_bPD U[-E>=`f,.[KF >b-t\qUtSE7)99vΫ覊n覊nfuM 24VŬXt6MBjAf$P',&QSFK0U"!飘AHzA ިt̤*%P=s>7o=kR3۹;I0Ls/ԘlbK'1&Λ-f֖Ft_+i*WH0}3gc04zu3brSL9NW BdMu"|̣s禟eqɡ=D \yZTJNظ!@u59hKlswQ%0&wq0|EnrcM8 ɖxjլk_3Sf@cBL))3]fM:͚,4kUs t(A_KEFO'3<B-,`S*E$ P Ɉӌ?m4($Ez#SyW F*G_PbLe!TU{nOja3D)`-XHPK83*xA #HШ}I?SR&upx.uK=Gm;PBZoq^䦔:&)RHzPE-c%WR<}#3bq<*ĵ^ -qR}(&Sb(<,h- S*hT9M0HNo`Q0|DDȞK#\@)T8_[4Cκ *pㄅ~}$㊤%F X3wJiS0io"0cg,m)c # s5ad-\[9R,t^1=Pf ; _˨,prJ9f _+wZߢ;p:gĶڣ0@O5&Vf VOK~mAYD ٚ!$p|ak\@ N!FԄt+펛[ v Q`# <]=Z4 ٻ.jlQ)*o`eⷴ*vvqcD p t;t{v;r A4OZt{?筼{z[ixBq1ɸڨ>hD@Q@;C|hR1Kv++[搵ԈC=ORaELv9eG[0ˀ ̰@  9$MkJ0J(JKxF =lƿ>V9S֢. *89YkCk:r `bRhBteG0 Bxb-grL2(AɃ8 #QȈ] :@%kuc)KbϬ9X&O_J&{d?u<6Q]M\tI"B:@%vxYY)Bƀ1*r!`*2FNɍ +beS@oQp1A0  9½XfMZ"Emk `T]&12`FAs+5# "$Q M[g=6* 3 l1SiZ & 8 ԉ/^nkaB '[*` m0^^͢edj{¤|x hEQ!̪k M [kR+4DKkȠ5(Xz+5'='Eq4l喆\VOh@u.F+ \eV>xg.a *kOe"ZVybU0ZWo9&cFWCy(S/[RW5ba*-Nn[sh֜ -89ǯD}w9Ը!&M䲻Z|vyoR^5ښT&ɂp*VSk "HÓB氯TjBA 5Wظ!@:O {R*8ͥcH!"WHcZ.z'%eY ce5U2zEML׀sF&5Y. 6}notHwXp2 /Q/ZNd.CYĔ  -^M9~rS9ƛm~h2O|3h (^(MP1Yej')C9GiN94HJXK!i/:MH>$uΓwHɳv (`G8 V;=踓X!ғpa'#ӁF@'b,H^?=\b|,P'կ? Jz}uS"zģ$c{.h8 wME5eCBRkZIS%IS[a宠Mٿo rlG.YW7k{6Z;k&͵%}aĩ"#؛G zyJp Ɩx7= )AP5fF`ښzlu `0 ǢZ}brj`x$f"i;/b}6XB?H 6o߹<@KܷDHN"Q"CTЌP6GB, m8xGbٻ޶$W= \? HEذyi+֗%ZvIIHJ$EZv"[eUUTZIVL~mٶ߶+?ȟNA(sm{?*Źg&t璑1>wޮR*.j G> w\R}=/m=o1;_a~+[t^\\ϽW!&^~gDځ6VZHV_Y*kW-;}=u-jk4ݶeoͿ`+u}ڞ-2BDi󤝔V} =ytbc]vJ|~f8mc[>\﹕>\c!olzBR#w>{f,J6R )Q:Jʕv%iZE|=>VkHuKu)\ A8\ ڲA"mxKm}O՞o3 6o3N5acs75Qb|'RL`Lb(\EpW/}aE}ts+sb{։5X:(bYB{C6(c j%Qk@I. 
Qr1 Eq:>x 0GF* o^_|Kj36Ei>۳8#zw|O/0Fp"0^k"WCh NyЖ-S73v@ok^g7܉ޓ4Pyp9%ݿCO~eQ$w 3#jG\Gd7o =sl:{ߑҦJ/$~ѱ-+mًfZ0_ ?u@>mv~Up1Dܖ񐕬6@'uT\nAJZU^Ei-G;ĭmB[xi N*ѥ o+P 6lKے_xj-)nKےⶤM__bT-lw?i9i1tUt%,(/X?eŷ_V\J-JX0;WVg"w􅑖]7Ɍ\][Ó]?}gl71}7qu˿$-iz3xVO7A|ܞy >D(l3?`1E: 5V柈O.xPs|ϗ'%]6|Es~/ZGGGoG}Uέ[[Yn7qFruZz`o}ݨ6){O7w"t拀vY~ 7s**BF,/ *kVҌEoh%8ȲE"2OE%<KpJҕ3xN xu؉~g #l' (Jn~:F)wƿ@Rhd mG\ԏOyӝ2zȳ/B%b+'_?W ~8(5;y  -j Ψw]QAPezN+ y<ᜤVİwPbMEE!ZQ>9ZIOe߶}T Õsí֯8E3uz1[$Vu1FatQ<9-$D2zHO }^ٛQ^ ?>!Ld޾eXbYu`&[Md1UOU هn=7l{}0|iu…֨u^@ dlbрdΗx W(TlR9:4U2UiGPzN;OMU T`* = 5gSe6qRm0$ *MVT;/ JfCVj~YR˹Mk7 0wRu,Xʘ3 ο*eOWg3 ,+;Be,{2 u"Y8F#Q[QV7] xuXKz9ia[ >>q O{`5χFVka@^c=)eheFM`\L'ˢ1Z(pihmR)J#-@rŰo~YFOik5&WaXҟxg,Ճ~_捱F&g7]-}[N^zwMj|Vp[:<ہ]mt-h$ ~퇝A?=^XqEnw.ܓ{{f'2;p6ihÍCplmF_h$6yAZ%ghz8h鼶 FTY1/e;; Q]KAARR|sI]Cʑ`ML0IIDFk3k(`I"y&iIbEL c>{EMJs4I]$% i+%8r4'3Rqme5.((IYj̥pWR˔b1j:֬6JX(dDf.hYU+VvuIy2/Vv3l8ϗH 1y ʯ=|Z;ʌ` eoWv^ Ici"b0R,5..E;wL yԞoXh+prcac.?>tN%d|2p'bgU:ꤽ}\:] lSwѩVF^c4K5V֤\VQW" p 㾜T ;u&8q--H:}AH(-R:TD8%K ̔eH.U #ے=gf,-+H{Y盵R:>#ڝ3 l@KhVzdR &ONp`JZ;,M.W,`wq]J6,_VW)L^%QYj *R^~VEGw(˙C#_);2u&XMhB16*PO%)l(xiB\k3kU .VՌq&^zR|Bɖ؄8-S4..wyglT-"[ I>wA%FJ=, ߈Xn**l<9*!;<9&IHET8jȳߵc )\7G.@/pZ#n՝Sm%X2 l Tn+G̫^\FO%|Zb?8עawnb֢OT: C l|8& EjVBQA(;b9g`o3$޸rM5%S#h#@w(Ѵ=)SŖ9n1WyY6:8R/<y֐nEK%}n#\smٚ'u /=p5TOIFzRV^A"]upiZ ( |P ]gI W"=65mKԜ&geKa$rKFQɶEU$"&x7AИ&ը%K.QI a^wY_$=Z)eJal7VkږMIT6r ܯʥB&7C295CȤ(In=,f -QƀLZQ5ږP0רm֔\GהIM#R%cC$gc7*LXT2ш0L8-PI- !kDs AS2!g#mKNL02KyշjDԴeпػ$VJ%㺑DK0ZD(k]I*(Tc|mٚ4h+$3PJN]pP$肤|)%< ޤlt$8W)I)>%1{㊴2N/:NrOп^%$Ź0E)ܘA$6(ղO"k"k $nMmٚqH<=Fl=}NDb^2jVg!1T>3 Tiz\gY %,QSwD+ 56|\Dd;RĉIZ Ǔ?@WD'#Si@B2nMh껬':ua<1vWRwj+MfFS[m7,YϦxH7E1;?(OQ9tڇ,6сSMe9fQݣ5S4e)3mH%{r'A{g0688ĞxҞ'Fyri.ZAqE%|=k!ơɰfX nsjv̖{P;رyjczOC#BdzbЇT3a[Cr-DfgQsUR X-xP"^?3P4dBIm-)FFB Z=V& EPc9,TDžeer`mt5eҡD}4,&oPON;LE7г14K]*6J '=k<F$`tDR$oP{0[.$Htv02M(3@w UeSTj9n\?WGV@٭/WbUGb薨 ~ɴt'J! T۲a[S=. 
&G-h>@G'/&ӿV:g'f7@n XJMsk</tbC~HԤ~jt 15fq4#ì) @Zsfz(=U-ɚaw!&xXY' g12l0 hhrM1.kb&15 {'I2Y^n,SP޹FbSf>Mrtk3+X (8 O-Z#1g `Hڌ9i %j&I <6g5.EewYI9p`W(NhQ7rIŔ`^ШiB~(zЁqcFu_ 0[1ͪWBn٢1z~c|+<ҩ.*k uu/fwW,N'iLݧ)Re#n zLf֌mKZHYJ ̥Y&8aц҂?-mք4ip9uI7JF%hT1t1j…9VBH;Ec"@`Hgj5APj (E#Gd7ot/2Dܲ5|n d@1P`ofGUeR}+l5Ð," edEb"T. ֩yz5B'WRW;aBH LHqjQ +FJ POF0B9"S*5A>U rP]YTWhKԙѯծ#BH 1j[fXKc/O3u[(M45IQo[kHq;(&w5ndCe;mp*3r]؃)'n밋vNhj[%]wC6z|r s4 _!BW6qDbq\Onb42#xEiO>}1qnEedz7n_rj8?Œ/q˫gtOGK;z &![2,Y6D.˙A%G=ɞcML&yudR+#%\l!1>#'ԕ|ny U3p@ƹ4 x%`v7|;)yz";҆}h1oa|)({wy Nwd^ؔX $ ~u<9AM BE;R  Hxd9 VT7[::\?7;xA6[laJB&aeJsJX"Y2' ̃y; Ȓ.Qp l +Lɑ~9 %klH8`<-4}}HɾN0|\_gӻ밒{^}*p67Ow ~`ߠoOfO]Ƿ&tCߌGXʜJ2 ME#gϐ*oh:KӚ3^[MCIbb}] tRBzImUl"$WLfFc <*(cӦ-V;ԝc+1WD}f$Sp33?m3S1{; 1ts6x|Lඔ?˷8IIbv>G&7bQۍ.FZёb'-PߖU(leM֛-UY)k\CkܒtmH\XTOWi5r:f,R8!IJ*3yEH`f01n5qJrY͚zoc=L}j Fc S6at$g=Y) Jqoc Q!dND1(ԮY<#x>\ݤV+;AfhOQ.ZA [3{7$sR|qzT聮Nz6W W4ЗWHX $O&7ѪjX,î4m}(1EI0{E?}i#WR [eK wpi^p~$֨T,AI-1K$DIƒC?o7]yuߩ:wpZZx8.V~ 8Sq3 (N=p~5i+{)կH5!C5PmYx.@ץtפwIcIs=T@}05 ~ b5c(bsF=cu7VflM  Mӯ6 nx !n=rB"K6Z.GtcbO.F>U 79qwq"\.EdH?&F!x!.<]MH%'w9?yVSqMc %ݧPHiLזW/Q>%TF2^]*-m _֟)&˸w/o/׿# =mt3[a9cr{K = Φ) \J# s^x{!YAHGV#Y-a 8;A]Ymd;w&_͍:R:QxIi}YptXY Rw3G`Zؑ>fHsyWD\X3d_u&Kuq0s2H2~OjO gU7;2ANZ\y8X!8ᣁj:#`ڀ~y}z xbT7zk:8مgK7jE$;R1g{)zϧy5($WN7C;xam)P%|;|@> En1O3((Jة޶r"TDʒMuux^x:z5\ץ\PdhϮ)^$_Q*)G!T{_JAINzs%W@B= cP| N"DIP,^B*t%m.R)B\, fhy)e xJb1"jZI(+%QHPS|agkeO_e<t ?QV,WuZOɭ5YSwD2`$tg[,qu=[k*r 6w~S?/t `ExE$/xfva"TW[)ޑi¨FI%MDU31f\;E:^{%7""\{)s7YQM*Ra r"F9@ʈQeNR9 wABNA}`Sĥw ݦSΪ:ޛnZ[VÕ+'&0+,0ӛ}#K~#w >czU=V 'nksI+.a}_#f"+;xsDHMPWI5M3aYMtE^$B +&5겋`E rD -9@e˝hq‰`HBs|Y\ ?]Ʀzʈt3p 6r g$Q4g CF$$ƀTy%sl &P8x8B7.8Eb47etxt7ٚv q-$: H_xO۸̗؊kUp I^T\?uXϋdFH1?(LXP.*FQ { {a K]6 X?C 3 %\%b"{= 1zM ڋ"YѾ]=Q:bO~d_V삂5w~i1 *V},Aj8 ΃ ]ICQ(I "g9H3R YsRsS(,4άv l3 \`1\xXRJx#<1uC;$8[8O,5$׮Ner<5F<7Z刵l\Wl/N }u={p㖳)" w}ƟSRۓ\j,+*` t}cDsقaq!$C}~n0@rFsvBҍIZqYyq\!;-@-Gk%=PSQdzza^Hᬤfo $Eva[\)ԭt-xҬڭjj oܵ_L j# ;M:<>.K+r0M3?{7ؿD,FqR80~t 
1Ss`뉷OfO=F,'od.>^lȾ<""P43UKeuܘHPHڅC`V["7aB3*Py˃r][5-Àa0;'~QY4@s`JI5\/N!<%`%3Ki#DBWs H̨r+ CU͆S!Z_nAh%*T_/L2!mdǘRvNvVªzul4wՃ 'fJ)<?,yO!EN5JhFUŠ\'Pk U\\Jb1>rmuyS FeEgxCIGGyS|˹90C(ΎzD!91p0K}EJ`b=Tk8:*~N'G=,g_Ѓ:r(}8+NYE'1WQ| ё?B0EсltܽRĪW2ΟJl9Eo7k;k~3Ǒ~_ߏ2t}4Fqy0ݎ*}uuj[PPegAUg\ұi⾄eTJgI/ax;-ZM4.Ci (DY8Au{́Dv ,I5I -6pȒou ۳CpK{Xn. -?9:w0")e+A(?p8FtNfn&:IuN;pܡ?}<U_7 VWQQУ0CUD}Bb0;:QNZ|֞n/e[{~@Jn*771Y* 8M8`߸v`ޗst+Tyh`5.PbDK69Gg BS`iDck0Zq| != x1uF8@IQ |;+sdqZQV"sʈYub'-/ٓiNZaB#gm5˯_R568{>=ÜW{ݴ'r N\ewqmTĂ-R/4{$I5ƙ)0eiPIJ Md SeA@ԭPTVpy[:c1q)NcN3vfDI'vI*JQQcGkX-wŅF!d9r89 >XE9NWѰ6,@vrSqWN/Γ&H蜄0 |Lqdž6򄵂><7бgC,7h8 >Ef6gh[Sc8OǼs%"4* %C".A@`lBFG/M"_/~7l|;NvbnovOU7K`H#ss ì!T1,U{>{M w$mM ÆJx7+A)a펀EW,֗nl~9X=FQB$0ړLsÁG/[^2K궽F@5vp(> V6c ;_wfe 檐YmΣ4J@-Jy49f-Itj;w[VH-`[#8Js$= #*tg5_}r&y,zb4fΤiOH:Q.>Ni K:y_$%e[]hHRTrg}y5ukI"+K(F"2Nљ. F>a[SS.1CՊ~SE{9C1Ҵw0T'8(\!i(!tR7!X\sު7,IanvtP U o/=e'LhcT|ۉEF{_cj kn~2ykXPS}n8)C!7^pR vD_yf+ZJQKieg{hU']~Bnr,`Zsŷ@|)"hOFcaF9\ZymnbH*aG 0RABproQ݂Bz_-ei4λkz`z߬0|Ju]rxjJU*ܮ u lBu'Q[~ yD*E&) RИ(`t߄!+KxV kHc.b]P[XRsfhctRr靸L}ZFjn"ICO.&Z^3flҡ6kӍ:T酸hMYăn.{!f{lYN=>@͠b1ZӋ ۊ{޿ҪL7*F->epC; J) n~<>$⣣eѲpv23PbT ̩:@a2>Q5Lt ?JS kINޙŒ|pt'3zo2AƶT0cZ)`epxC USע(L >9-`E`9 C:'$F}P)nJ|j:!ԅO"( 7MW&.-‰aEX@jKyyE(V5Z7' Jx+E||>GwΒgߡkw놎xo^a忹v:&3;VCV}%30&ڸ?:Y2so2Y;ʥ-ry'0')zI0lF,owLCh(C1T䎱Hg:|x`0J 6yΔM+;h#0yy9]TYCjbqt֛ji'ͦXl&!nXƛqqZcz~_wBO1B1TψKQ2l^ g-5'pAs2qJ:d_8q]=DXiʵyRY "i>/QDȈ E1 &d1* cϥRcd(%aաWwKB'76v+Mn>@)v utd 3ܬ@ bN]$Z릪\W%n~jXM |vynGg@|,g.,N8F8Z|p|meTS'*+ӄ EV$SBBW8aYo/#^2a"+Bf'p+Q0\u7v-፹Ѵ[=ukP\c PtاlwPM fa'hքXB$/di4<8 hg | >DD=Z^Y6[s`7%IRu ^S_&kD:6m:[kqʹ' |VlKbKJ*[#k,V!;k'a}Fl?NGmoVhi(Z>Ȩ d>gc 3 m|FʷY9aS1mJ*GHy o-g_$(yx#ygcI0x?G 8Yp QlGo !-݊{K8Ѹ#ѽ_*eh*oPpTU;tqJ9L$[2jJ+߾/ =mƔlŮe] y&8AP$ȀSe(@M>65DjE΀lqd}\5V%NmDA](.D8;nĞ(War?gwp]nWFvrZum2ɩm _ QM k4ї>T#U5Ь*pD߹L3WO6ꅡrנNP>ƓSdp_zt/Mp{Ey 1;1 dic{0L5qP+8%7ɪIےr΋;vnl@qm0pIY>.*jBLeE~s|[(x= px} t~guKu%jmQw7$Ӝ`Hϭ}-W'.Qqv?'݃ǔ4H-Po4QL*Lm7O`eD5)JM)osP֚jw#.悴'p~+vAQ*: F0زN4;?ft2Z9GNryrtq'ǶQ{lW%X1%TL L)44!;M"fNcʹX!H%,qdDqtq)RabQ,ihio?àkr:RptwlA t6Ycǝ$k 
F?Yy/)4F}?.ݸxyk|*|1e#{>pgg4Ilw ]Iy0;dh<\nƁ2& 23-q?'1 H `$gfQr>GҔLt e yDI?LN>Q/xKwϗ21?fV!w?0>>ͺf`@̓iDb8GNjb7gw9ή%^@U{p >5\2BAOAٜo߼8}4DG0/G-[C$"]mp|v>CibeO?5]UjW7??{_yi/=)UOHDj]o(Փ<쉈>^ܪg zEE\3jS Qp|Mf;ouՏdKwocgo_Lg`#u/a`2w#Zz.ɏٻ~y& &I.c f.u%dzv?WseTgw?ty=pe̴7^?guoF|zfg% GQzv<OǓ,4s&72./rܡ^f+A/*Д};Ia*/1hхg2;d >qMv z# 쪽~D !msxͪ\}I̕1K(Dօ$~Y  ug_3Yx?:fb};=|@0Rp~x_z9/WZ#~e|R]7L}e.⾬# '!7:QN,DYU:eG3^ߧ8 1A p S)B3zkS"͟F4gaʌ-y>E83?^;4d]l37G=o N<d ?PNU@ᩔDr,)4LqbOf*L#$G#FbQ%hD%\:T<`Xr~`7>>Bo3@>7ZOp]=t/wOXW$|*u/hNfJUC(Q%i]=roZ_f#jyW˻ZZUaM"U^ŌFI(FyLRD#%OH'k݆rl {_d&α y/I痷/}xy' }a؟?%8Շ[JG X(Ho_ u.:p}edq'W^t~.!*.X5w_yV,"Kم%x&OhD=xI q a SYIqHd-KE& SS )B,SSl:1 8 $1nRLZj.RHxJ,z|JE*P*a6r:, Nc CpEY;hzALiv pFhPYIuҲD .P?j?|{]Xֻj|M@ c 7C&'镮> ښy5^f Ĝ}41 ]*­㮖'%p (%1Bڧ?-vS_(Rf_lKVW >H\'cwm^os2\EtC!hRsJ++\QSrzmhmimhmqQb xJXsa!a!v dJg{*qsAl#^chp.czg H Ź&qnm B-.,(EhN^ˇ<%/03޵ (=FX3t=Z}>`-/EJ*LN$@CJ\~dnCki4u"N<9uڠI3@B#SmZP]Զ,,;{Zw"j-O!\ P+3pSPK2'K➓pص'xmzz-9)w=:u=hpt#m@듆vk# P]>)j;uaBSj!Ȇ6ܣymk*ԓ*)gToS+מ'8Fq⛋c23T&g]o>>~r6}wó<LI HƩtodzQ"RśFi Uhg:SD1]Mvbl J4*)/#5xbڤOٺl5HSp޻l5lӾֶ"]~I٠TB&ho!&^-,&!{XI֍?a) :m:6V ompRڅB)|teQºHˁҨbRXɼp%iu rfI[ 5%m6 R2CdcjX%23XFDZ8x>:PP.TWɠbB3A>z%r+cރ.m˵eviC\gfN1+,`%8s2 4\(MDPQGڦ%8:ȼ|XZ6SMXI3>qz_t8Nq䮊yK\>p`fkGqPcWsP_(cI\/J֟d DhC(aVYo uޕOW;]̬v1e+=srOKM9{ty![|5A{/8!k7g W Fi:o_\~o䂌6S[zsݖ6t #-um,k]]]9MQhB8}-]rEVt2k9f1N'f)&1EGѴ_ϟYBa{*7{3w4.dnåG2 "ct?kќd6AȖpc HGT=ᤉ>.""""]Gk 97t P|{ǦCVz ] ]0,i*i|[xj{|qBs"VNZoe,F9M>*!8nYfi+΁9+Ec$ħRO_Y9;)sA53|-b`Q¶p8_H/ H"("ωV>P=U<{2s, #OV+&۬EdPf(y{.PouFҡbX=)zRbI7V_UvnBM)&>S}Vcp!Zd(9Y>G" 6qD7 f!2aJO7-`g#w8On4 )HP֖DURWed9v<. ;>3(eMF 1.ی.Q\7zKVphkY$FqIhE 4T <2|NDJhk #MʁRTd-Tc68tYGJY3T' #-(mλ8 ).pB.U q# J x ͒\޽dgg[a:yLR g:P8i_!K IԆtRmH|s.ݹS8p\ 8aQ*4ɜ&ʹ"yNk\8h9Zӌ(A9@ɓGzĽjïh. 
;U!Wq/U :Ds Q0!˿ 3(`c)"&X?Wxݴ(':-EqfYdI%IJ$fImy-'}=ieWDn "|ov+ŽRbdCC8i` ;/,[C,BF&)E#VI@ء [xR%*hl7*qNlP5v륭4@_/d pwEJchCPdVy3m@مbh9)\Lsq%ц Ѣ x3&a!ڸֆjDZo:Sda)PRk.4,V Juj%N-g=8L** jCQh&ŭC4)%¶!lk|I>DIf;KvHhRjP|ZѸ}C) v|2_H/ҝ/uAX {BT5P>9PA{vOjvsv*rήUsq-ִ1F*kG=r8PՌ{-3'ɹ}bԏkso]doݙ.)/#4zP|OUPtz!fQ{ G K92>g8ڝIK>r 8erBG$[ cԖBW><R'9$qz)&rF`W%gb}?x?xoop1=xB쥀FɌ,bl b 5[Ud-=" EU TaɝY\[Cβ`T;Cn,KR;,KRUjcv`X2X@o27W58"Seb"[f`iT[GG7MqƍeV-~SXE9DtѴ]h]Hm}B #Kc*G=7\ :޸*Ȃ;`ªٴ6,b1 '\j UqtÓ` vSAYeD"TH &xu5\|KJt]ױk uȦ !f#1ϘULj7 $*-䚊EqS@# 61ՠFP0Bk:kaлm>\<6pZzSC+6ߚ]O `FBAš +zmA1SKJV˶< K%gmEITF ҽ iD8݅\kӻ<7JѰ~=#.(9M#EH"$^h6]tb tΣJ+X&݀0էd7T>'niv3_|f9O lkZ~3GQoXI!k_uW'2kRbwuw)uAօ9Kƺԡ ] ZՅT 4U]զz`p~fқJ"lVrXJ[J&ND5AEb*(y'פ(X-VEw7:g<><G &J@%+>^)iWXTR`.giʠp~ܬBRTgX(xr=Iai'U:᷏7s{ȃUj{'k?b(kPS2E 㘲eXmuʎEcWtjYN5<܁ ԣ)!n+lK @@ajW/vNBUR,SlES|~Bf`wNxHey_6V\ńWm>ֈRWxwZOcV5t#P'lHK*ưv.*k2*%\XKJ3qWs.]] @kf ׺? ^߶BE.| ELas4噴&V$oyr EL4*Ƶߤ@S?0z. ׾vA {娈д0Ez/QE,'d[HKt.#RH5v moxQxt౯ktht2h01Gg=w?x?xo*o {+HV i*M 9,1ع ;t "L<]e{`b0&""K֛&bXo"֛ ƓL7MęM(mŒDT-[Վ3d P1PlJw[bJgg%:t{.S)}p,1EI<viPDKE% ud9&| WyvlѰ ])uL0Tk #n؆ƹ*"东1sQS)A]U*g𚺘8?; O^jD.'+|P|'Lޒ"8.t.?;o5cx݁LO;je: wxXRcs/)~J1=Н#dH7'/-Ua*xWby8,lq"S|3LfAͰH󪿦ƤT\)>!-D6} 3neϗ  s}9;XVx4l@d^0SύFKh kҎ+يSV%{VNjmֽ}ǃ˙-oI-\Z'mPZ-R;l |QV/QKz<8ɔbr31Vט+Ռ@8$z*ao< dM~o(ི mf^=9;?l;]~\_> [|JNttJʲKVWVd}u>N94'KYwo_]޿A%<8L1dSϖsR6P= &:j!z扉6'D!M`%۬CcCC޺oZ&;H@RltM\:Ppiu2ۖQ \TaH)e`GfBdYB.pɠ:UKi^3Etyi95(5B][c7oȡ3n$rPX$$㏃'XtHZ\̲,14ٚ+&UhبVt\YrNsR7TE K(lLJ yޚLES @2lB,Ti#D?P  o\j=) ?#Yl0]7~bol r}9 NKtzHQ9f8Y Ȱ$zjZ-`>/Kݧ UbvP-~/rzT C06YV^]=0]sf* |پ:'xu,,s9|Q;e|_>bh mj!U*;x> uY=xUξS^'W+uj  Bv:zvXU)C-b'{vrÛƫ9B(;pڂx+tgX#^V,,+P(ysP'gT7Ph@ݥ|g'ͧ`UϖXȶ ng=zB&iek˓XM؈RRv+#Ae=#<  &ÏT*ȡ/R'Cv%)?1ԗfSߘo[{Ak[_3cAs tOvR Tr= }ut^zk,egTk]E6Ac#61j2g~?zh 8 YJqBSRQP>%أKcو*kie!Z'fQ BmV/_J!Ji5e (FzDʵ P/J^yVpg]`U%f8D@^T&la'"g(Y< 1C ZR5Q@\#!M(6^ _l:eb*igeʘIUR;'c1lq GZC=14{q<8ͧѷgl/"B!ڀȋ(IhRWw2s0%E9šܮ@>ĢህCpC=̹6i9߄p-M;6t֌U >v/EAC.ƃk5xM5l10V4;g# |ԭt ʄɻmpD}cf|/#B8Yʿ^6PrCs'5ޑ_=GMRf&Nk0<&\& -.ԞqD/% )#;Q QUaY]Cvdl8T 
N67"L^R_i#^A<Oza3$/ϊSn=+Url3=;Zڨy[_F'L* ]&@d;T Q4& &' L6,ˆ& 3#`z'r&xJ>%ܮ1IbZ E3&M&I&W8},C 1haJzǡZgNscC(fNM< &%ԠWi Ax;k-! S>vR׃ E Sj bq,0LV I64(Y d#L 8`*k!cj$lT(LRb6RGL87\Yyҡx|?UBUCUU}:;\:$l_ F BYTfVU&™:]RX;-]j]+7@ _ni!*-|9eCjrj>!, Ǻ'oY踔SXq@ uq`;\$.W:* `^`B YȋNH%,9xߢ59(YRmvr{ C&cXz: '1 2ϮB GC0%D&m!4v'/"`{vʄ Fizm]H7ٳfFY9=ٝ<$E5OК9`7O:&nY,X""̣H ּ[x;L88\ ulchރx6u{>"k #5d것5֨a5;1\8kx.aMڸkP l Gh#Ϯo݊5s{#l|C10Ͷ]g#DxfelIPZa5*]ZA4Tv dH)CTMK).%fҼ9*&M\ v#O~E-4 n*ݥqBpO=D㐩[=iqvԘ >' 8$[]`ws<8෇]qz KNzC= 4,8-9rҾu]).j%M ko]<$)lv"f @65[%->]VX-Ul ɋ}z_^P2w$I8<]͡AǛ3/+ ,2;?˝h-Q9vq=V\0jӪW|J1S ,Seca; YƕWj>FNQ\?ˆSĴ+z#xx]M/c|mJ!DxbGߓAI*/OXw> ہ6#] ꄂhVglړ`7MWq@uDLD.D.RETAƙ|*m9Na=]R4Dp[.}'ק?5\G:M 4df>b j PÁMv*Y% 9ȔCUBtqZ$I,3DJ֩! 5v"?YaWz"?T;` 9.Nmĸp}CiBAdF} ⫶_mhm{ж -勗_v24g/#, U-TeCڮ[~}=rvўZƬXKhI P 067NC|1IILglRDX$MRS&]b!8Y$k|Z5,)"|l9 8iy68QZ_ Pա=q}viM0?}|K|n^_/۳\Nu >)T~u } [Ztsx0 |=ϦCRvHQ~PD(wvzS] :цw91 %'u]jeU]r/)C.&.y=Ӆc13/_rM+'MF˛|79.{;xsv[rc1'gwt2l9g'_ _9fFio/¿AoU^*hWмln˰T xYkY;m*IP[4mtFjtF6D9TWl"rkY)FDXIuGDgR%dY lO>oM =}ӖxaKã+3vpF`U6s\Y%WOu27>s3-dyWm [Ü^|[W Y4i;iZ}賴[$4hK39c̚Tp8=03>:h|n"~>s棵1)0thBQ(D"$jU8)HhSVX N id s,)P1čḘwd𧯺: p Q9D 0#"gA)'2=|I"}s nA 7/[ZC}eN \E㾺B]Z=U*,##_..7>O瑜Uoq0~|s,?-}0K˂NeDŜ&Eဢ;Nc< 6*s-pMZޗLy>}Z4DIC^)k^ѱa KnCyEuu{Nw5)vnch+WuJI#׷㺘 H; *ډ1 &S܏N2/cL9Ke N/n5kFm^ԏF+J*:VjcYENir*Ei+gIDB! $[3-^U FWyeU`,"$M6riti `8К$9kD34g6+'L׹6d1W~r&S{;rUht\ ln!Ж ,Mv+tJ(dgdHηsK&FfoC SF 1cF Q}ZN&e~y\H.>.:,]r_y]=?oW~[NYoW^[O_^ڔH-ҸFZQk;e>^ X6sLRH %`ʭX@qAz>&ͳ4W洒6R|ijdSۆǶ^տ;Z[1@$|Y!dpEaDѻ,6AVV<#&`t͖wʈ9[.<[.^4p&VY;wmIn3:8U9bg?\Xe *__^,EThV^_w_ݐ 2tНCwZ i5ݬ҉NE#xȬI *HR$@訤(Bݑ݆hЗƖ 旳h\ 3Y8m:S+$; rW/&e2՛Ad*u83O[Q/=**e8t\?^bאWF@QMx!0*aGoǂ1a-0_+ r> &Նֹ!*T?x=$x~g,A|z_{0.Q`iBR\**< `|y[?q\}EX+M|e a.XDV>9}g]?\,Jk^E|vCdC99݇u(G,q^['?6K3",=s햱}ֺoljV<v[1"dM'wS /UJj+ yW<f:5s 6ȐC(0)Lz+ 60+r0ŵt멺[y3V 2o#jn^CAxM0C-uR~NX ,Sr]1՜B@K_os-Pi;wB 7gD~ݥm5\ѶfYfSU1=ʉx]CY5]u1.jgd VNe_[n\6&{)RS,X6^KW7mO f0CHnf-9tʛgo~Vw&y+,u/R5Rp5,`QMߖ+wPQ?? 
;y>PT.AdzE|Y>m0/`VV2 A6|D O ~)>|)f} 9nk{$z;k}ގ4zja\)u dܳE-әz d?6wj./;Yׯ燯͂rlknb>糫r4|>)Pe-ԢIÇqz,cIvVxZ I1!  %eA)@(k."m/;ͤAA^OhY sF~V!#Uz82Ϗ!FW}ms}M|7+rQśo/vT![MI fzQsA96 +2K6#t|:]~Z\}m|Pc/o ,2FsC*軇Bύ@ţs.$a.9r| RmöL}S +rMR4\)+cd@@ qy+q gϲW/-hX$+A2@c1s0`V>W {@ c|LSx2%@dQ*焬3"C\K& xRC% NFoWD3Y4 *-z^WFRȕ(Jpky"hefDUȏ- Av.OdsɀVke0Isyb4eiqd7k5{[J nWQ(d;f(g(TǸlY E/u5{^2J `݄rF&T3i qPda>cȝF>O#cmn}=À={Q$XV Ԇ#ޖ̖:Zt SqOb>8_=hFՔ:˚*)F}t;x7itN#AɇtpsO'^5P* vhea;M];b~c2w{ o3]Q'ܑOb Zw̡2L'0*r/3" &.7&eqr nξ>Iҳ+S$\ \\y}\"XM`xRMJ:DLG9Biҷpu4=f 0r (U[K~Kͅ0H%7+k%rDy:K-ytz\Zh$1%xInسܳD"ţVY۔2c~~t̪< o!Łz-K7&վ58nf2€y۞߂c~k' K!ۖ?&:yw.BB_ >jۨKJFI5cXHp"x؛p}v@#G^V+mGsJ}Q}*FeET*T^:)0"?ZC竍m ?Ƶ$c7Ewt㠅ٹ+s}L'=RalxAK2A&; ckXM~raϮZmZDtarYk=yDJ(YNkJ6Bg`&*AZpݳ6˳#簤marˁu^]T) e ֤+z \I\)qLTE6&&7D68s A.=  :B]4 SvQi/C^Q|Wo㺚j5}lj$p >#8ǵ vn‰,1-OV8?L ccB/ ԎVvJ\hqPqpLtae{i`2L7ҭ;k:=Bֈ6җ6"kQb!aQe{uۻ[|9aZ}k(B朜/u7}սӢ{EN6uo.+-J/0rp0]N.i$SPҥ4qaoe=՜?aY]`h+wu XhYFJ/Ňxi y|g% 7L q1V_+AVAOKR iF[:ym9PE(0bZf?cn3IɁt(J7˧ɝ/>ڥwrޱF jN0t#moaب}ubs SforliېGݝ;B yͫ~҃\i8Z6s3&x'mx>^$#0-7'mkZ6&s.Kݔu'>ΉV581._I+f\u4H_~ ^d: z,DѡV SV1UFLq3&eSMtZ \>3yNZ7[ntUnOB#YB/Fޗw7=džry2@UQ"Q2IɱO5/&pfa$È#q8]UuuUuuFX t3KX 2* "`s 8:F"Mb_*⨱knk*E21wKSQT`JHdSB 3iKM  wT[,Y=F&Yôބ7V(Y]²iQsE_4e^p0Xx3 m/Z+7*n߼~m7dh,;kaQ,1#UTN Gfjsxl:hoL5\N1%v6s@6:| dAI#ᤊN_mᤊ(Nl>@+T?k@=au{0e<ڈӆʜ.p5"ȥ0R ^(ՅT i#7Tk&:HIU A )a(R+m8=Tc <93^}FC5;Z_`7~TǗġXlc yG9)z W5oF~j˻ 2xɭ5[; A; ڞN0vT;)?3|we9G Pp??W ?=Hppߝ~ VFy`TξM+'kp wU3*\^N\*qS示,K-̇MBCmoL6|χ]:MSTOgoV#;?zj8rx^oh+pGΘ-)Nq}~^JfoeNS]NnTzcuF4ِh-kC !h9`<48>OboWpR^pv_t:uڒ{ShYҢԱǷ(ռ~~8}d >qƦgr8pe V S#ͶK].χ_E@R zKUO'Z>ժlv2\5% IA'r`_ fށz{ ___׳_'҉gc?IPi$.K꡸=V v݈uwRc1FZ,Lٽ\k`(\ ߄F3 X?>chtu9 aG/]c)Zx@k X0c2nRǿv c*6AIŴ] 55ǻ?|ר䉶++Te~R+ZbhrP@N:`[nE 2y(5`B#$ SD/_?OFu)-gTU;bSK׻o,m>#@_^xUWZ~1laQWXtqH]̘f坥̣?ɣ`x~7Š˖1N9tۋЈFT"/i3Jiv{׈sN4'N\0_ ѝ}!m/ PB݌WRND3sylA1JFM$ܣZ 8Q j(`T‚9Ys4(-rGByqWIT @o2Wޫt`I?Cޫůg_@fbkve?KBNk1m-rWnʀb^SB6wSY:2{KJ^c,-ۯ#pASJHZp71$vY)rhST]4L7"Ynr60FJ*𲈪sG w%8W 5ܙŌ;Uh\o K>sN:o@J,߷r꧐V?)/}+KncxM>X#!8Be;$Cvʺ[Yws"8[/\RuDyNI:'u/Ӳ([awA&R`(7 
Z:'ցiJ8:n\*󥻛XĹ6.(A_E{SR2tFn*G.)]$?r}$~ɠ?)m7=+RϪ-TB\+mv gH=K{ŕ5w7/$;=UXInvߝn)) qvEj]ҹ|.!|_.B|NYo,^Y+四uia9{?ܢp}2?HgJ8'yX8[:u [ɀF0ӌ@n2˘49677cp榈켸E!*;hhKB4yFq&#ðရW $9ʏH?~&oZB4$(+,R =PB\(#2!'#(AFa=Iܬ/77{Fou7e:0Y`ЩBja F oBEs3-fFkQKr ~'1h'A;j[V=ը \ivP,^Qi3v/eXο{_X_{9Mj}/M'n="B@p Et(~CJcHH2I0l bV1 YkA~/r\*3ҫ4|Q9uPfs-5,BS @kz z4-ؿ75[!0U>ѷ/ RC3MS軧¬3u^N|&.K ep[J ]VqR 9mGej?1@~ǑX6F 8GLSRN9#xjw'*E&BRhlh^4Oxr(M\5hiX& S #hP&/ /dxzCzKQkB9kِi5w~(h{;Y;Z sa?;6Z%,eSO<&z}--Q:j.8{$$csi¨VI{5;8Ը8V((j)*BQC,+SAbA 7O8`{D.N*F.Ko}Ɯ2xRPbuu +"ƣP^-irvh&Q8pe_0xsᕀ_7&u|)jV#ϊĶTp?@CptHzlc~$l*mn>Ns%~RTyG&粱0qr=ÇHzzwlF 者YQpS)*Z u@+])YWzP֏z*@Uh U۞$WոA\WBжUQE_ So 7;߼w$8?N!",GGŽJ^Z88>,+E0Aaճp[78$5cOto>|r[%R讬w ޫ;q-(-=sNa=x I !0$&)5 (KȹAz;")a#q\$١U) *?*7UQJ4z/ٞa=}2եPGvv%1xҥ\r,rw[Q?|pvξgu]1&J2aW! _XNz[1 B1?x!byU>HI_9st{P#^(,+\yon3:3 O3V' \U}yv1nntv<򈣺Jwe7m@ (.U"~VTk"mߚ&_>߿/F͔oE[>}fooot{_B` ]U|@{ M]sN铌c;XivW{n'ۻ+Vʋ٣NG#Pک8P:dH+T$Mhn䥏BT jDXBBRK+psJ&w\QDOBKSPS5ȒiܻFFRENe*S2D}Y~}tTeՏyF5Y0^hKABeay/5_yOObe2KٺR0{ **%R( {()Ms͒M@N8h$1!Zfz:VSa8ԾU_HC]0Sf}Z܁hM$ԸSx4 ~m8Ӟ@M-Fz[$a4+Jɴi=4B7ldL08(ɔi_3Lx)B&l&Ɲz@ǫmA{: 0=+ 8?9u5W9P>zAU(\*酼 "2¬4*Xkz#s#TAU 3ҳ^ ިG% *rHuWܒ<c_ R}Ķ۔jeñRv$١ІgLOTۧ,P1@"љ$I"j3 `H2vh2V!p rl/mL8&欤IE'PH30?ѩo`D'\=9hNFgwqh@23CfƲHKgtUX͇[{mw P AFԙ$ 0!*~َbPNֽZ.ӓZIl{Ob/~Ȼ- ]?%`O:\yz{ob'=HW s<l0t*ЯM6DQr*H:M쎄mYe.qAl|w5/*@T@ T>SFNp:o8cod):gFɎK.-w.^aHvRx?SZ!IZFGzPҡ 0cc)k %C'uoD}ԽJo$ PjkPfS|=-#g-#}==;y*T>CvW oȫtH-DJ!򘦎"<)| .8VJfdpp(8i[5ڌuSY׌p-<Nh~`qU>Ǣ {;rj} }Зub0<1vUպ~}ARv f2 UFLYm3b#`L KT~Li=$qtZY|8; + rJ:K+?nyI8F!"pߞ2 Î\zEpλJ]\e=|jg Ϟ޽Z7qNbYwΜ Jtӝ_=5v8"ۦXѥmi;ZERnTtVt>V„9XDvyt!) 'WϼөC 8PVro䜡s~MO:V5_oy_n^5~o3_6Ay47 0n@[!5t("!#M:b0EF; Q#45TZ61/oB% 'CZ ?5w|5w#U:|9-t)L|'~}~Ɯ2(sxd 7Jn0RPF c96909*r ?4B.)4@N8|9Lu K!e@N:c%3S u9?e<# "?F~xhWM5>k\"P0oF3Őa\ lEI.KV _7 o!+H*p&X~웖.$2 Mf- 8ٌqN޵$/9~IfpY$Hffم$.zD%;~Քle5ŇHcb[|TU]]]&|m6PŨD!uK{uβPøe׽T:ى˒;rh'oWxǩG(Oь&PԵ4YCX&YO@R"%zEc$Q*7MR Lʙdy^K!=Q,{ZrS FXI" :ydW#K!:oy'Ӆxg<5hzW2ERR8qb@g3M#:+[\>d:@45]g:SpB{V *zRo|ΊQ){QK\7I,S=̦NJRpÝ0reKUhX!zSJW:V. 
()fceBB8))rq53@0Rbe$ 5ķ\cCqEsО:;>dvndz ]b8S;N٘dUW:o0+Wf= a/A.OR\ܦf8Zm0]fev;fÇ,\_c"oQM (Ybi>P;\8<5N0&nkgxW<vN7GGs;xhAjOȢ Dѐ+Bmi*el%! [J],Gf]p]h/}!IO+1FRGk&NߊRDH6NY0jYH`8q!fHG"HaŹLU%*{䌼/7? Wr=: *AyA>mJ -{2e؈ a( JN)2I`!SS)~hH^>x:jT>ČAVMg_PhkCWf(&t vc11^prXF[ǂb]ׁ'K[$f$L2&I-E M~vȈ.;~|^$v͍#Y3r:H \(Nߑwo?.Xo/ѯCdf<DDpV+7paE˜}v0ɿ"i|Lz'`lNz271cJH-9}|u| ;b,kƢPܦyl_iV0D ƂK=WZ2%8QW S#c; i[35xV_rP H$le="0BDjXS =8b +ZM'쪱<(e z^>c*LÜQ:R"JDpae 9xDI"qDJ+Q|>NJhN+ \ MXLE jS5a11ZG&>&BZ k(NH_6ff{ ^˷KVTwRc8l:no=]HK;Ef 5JTyUolnTyt^SɩU벜^wS4bSbվ5̻SQ5̫hH#+ mǃ>e/kmб  󮫻V8U,^Y(XRigS61Hv\jL^%% ,*( kWw>pdIt:*u=IYDU~Ҷ.=dZT:J,VuU"Sp{iEp`ꃚ [Lk3N~> I?; *$[bB5?*41#w09U}FVF <-^k@9I Э|u/1ժ%l>_Q 3䌨zrktqdo T ' i`Fø0m=F+zqtP:wxW(k)&6Q\I:JU |kБk H6}߆t JPQ4T{/>\bA lb#$:G @*GQZr=e:\]뇚(HTީ’K;Wa ]IYT9ݩ ;Wߖ]+6ȹ0a]ղc6"uU<=!ۭ 뻉c,C)vK^ƷJu&8X.D_I9§GJH"pKExd&)!RX#1(Z\I h d\ubF<@R%e* G$(J'M#cј8N)BK<ןZ|);܏vɂW_,1 |jABD ȶg4"YMt܄m@ٳ>-qIV–^:V D7Ϫݣ k+N{՘IǡB$$:OG6^/R7.fKGJ|)h|:K S{%#IDƣ!x'khEfPRGZ5 eúI2dH)0mYʦ_$}Aj~T{Bǐu7M2LlVs!1Xwf;o0q戮"ZmKoŜsB0RNniܹcFR,Co>q! %n;[OyݡKhV}3EK ;qB*!Xt7yyDaxbhտ[xb;NG!Rww y5r2ԥ ׃<=x؞zpͺ#^#}`GNpaxW굹WO{k~g@b^:!&1bUUJ׈Җ#$}Ǵ-/ݠm:oN~I}o[? Mn K Io {ͷ8@aܷ 2)Ѷw9l%o 6=3 x=y~R.zfʩ/>A+FtØ^fx`KS3"ؗ Uٛ/"6Kjpq~9آyq%&w?cXƣt+ v5 5/p몒"Zؿq3T!y{]Ѵl6ڋ&eOhxp Gfd rL糇gHYkf3 õxXʩix'h < QpAF,FqGJ(ꔦXJ$ )&x7C_JWhPE7#T\.kooY {~f0 (H}BJG,,a̤T[9Mxb8Jؤ$F%s O"VzUARBJ㌼@ TdGY,9rq;<2Vl5 qDٜXEql$!4H3P{ԂOYN1FՀ GpF<6)Y1$TKk RK)&tWgj+hAL1e1OyEHfyaa b,fp'ؘivz@N=K.ؗ;G8es/Wlڎ}« Y=.!ǻn}lw #ހw:9Yf3B81Wۯo#b|;@&.32dxgr2띀%9L R)폘1vDq*GcCؕ1Zg #vie'ExY866 s<,\eE(?&|cb0? D0L /[s{?\doZve'r2-ۿ*u]"Llm@A'^Txn\ 72<1*1R 1}ԄB~DxxӞ2[>H poc d)v 8n2';P Ou0]mzy[}֯\v[Ʊ7ޢZn/ȶwv_C/? 
B\*ٖLDЀ-8⽥1ץ&3>=U~+Xkz ]>N:]s>_׀D ͉vAfy?x;z Iyk0OD/0՚IKT/3;_+M5Ӕ~pd).1p|M *VOynFKF":|s=5==Q=\-݌tK֣c:#ٿBChyVslCDӇ5E2$e[# c;@DgzF b7ft!Sml=WHߏ8V m;ATp~J9`i>l3ݏb+|ڲ[N[v-ЖJq-Kl9bpvEV^6ct{J|x%߿΃u{:Nv/Eܑ;bOGYP"uKm*ӊ#Epǘ` c)8T IPcXs!Jh2I~2s; PrlAGƤ0Tp*!q +) R|%PF# wKH\ u/ujJ^1闛 cص98Sc& FR+C DeFLr:t1}h Q\gOtP5 wH 1ytHr^!xJ`£Y#QTQ+:PUG B@<;/SS6@L/Z32VIdc7vŵuF9fS@SΩC ]P\'VG*k`GھaX_5B8 No( A0H,B P2UeR3)V': +Ha%O{`Ci/jr\hnSC=f)s4#m4hB C,E5׃:(,E`X0 hm'B{4(Gʦ4X@};ld#_e+Eȗ/ȗlXj||owz~p3hU>ƛ ]Bu~fW3 Av|ԁ2v Yyg%[gMg6IL^kԸm`Ye'sp91K | £T:\:,y++[0фY(vy>Ak*`Ŝs/Kݾ{Y l9|`CD7?(WBsh,3?B~c$x!\񪧭xݽ#EM%XMB$B6M Z eE0…eojiՖ8})>0)-U%eRC#֒{wq-bd3qrY} t1D[}lͰD0犞;[E {=H\,՝~z|^Zj(Ho'u!~F~=_kaa1j<0d9i+#@nj(.Q*&斴br3"HR`m]jkB-(rqefIjH!TinIW;T=#V(.3s5 z׍q>4sO}WFc.ƫǨSZn{7ƸzufkIk>\,Bw %pE@ӂ ,](;yT1JnbDv%3K+)| qW΢xtmQIzڭ.ʰNwTn{Y;n'ZVmCr㩒UeL6Ab7]UVj#*%+8n2a_KMPK*?.ZIgP?f^(;#?L_3`I /jt~$dxcRNIҘj<7h'WQu|l"ʛ*6hM0(> #sf&$ۧ5LYA HR=D#KHS4aċD^P]aADl4uh]b3Ƥ`v#1%BJO )\fnL/䋜nﻺ+].@Q,i/R%, J"NpOWo,WSc5ݹ@5w:qU3jüeCQBVYxXZbiamW{(ܦ /qJ27TkY}m5hsui;𗐂_6FCq)4ڻTUI#q 7Fqop[?l gfz RPKk轢 ɮp?f|eaO_r=M07Q]Gbvpx0L=8B %)kV199Rg!3#/~1@:(OP~EAcXP͏%,t92FHQ&dƭQ"Xe^ U@dwԛ/#F CKs;:*EwGب6\>š0; f/Hgu5I ^coƏ뢲xr*c?=-0$["7ù;**DoVrbKfCR|]dEB?z_^M2CºfM"BJ0ߴD~sN7J"ʊ'lDߜ]QOq]WDb:a ."+\Yx+A(qpĪc!1B@O%$8KoAu_T׺///CƶHy%&R͵X ~et]|,q?OJ`5הBe/AhHp$F A30be'\Ab1HRxB*Sk DJc 4uTc덑J,GZ[ 2IqSlw[I<=w[s,K 7".>{1+>{5ZVF Rp4q@B;Ñ`" DP+,y{ǣ>(b3cHMPF%BJfH9lHI!qxE"i5pc ha1)*4!PpS,ia5o[7'sh\\C 0OB%x+5 BTs(ˢ j lPNb%(1]JV:*8qލ{nJT )0X& F'X"4'.&|a0V#Сt(^y>Vw8o 7vq:mPi͟ ܞ+@F|4`z$(:MQ`ْB-1Mh+Ƙo"7iٴT:a2Ql{3Do),m}YX>llKfLh_Φ>'OycnlT^vdphEtXUT-Z=w+\Ŕ#ʿ{o$4x6ߪNejFѳ5ֵ~dp9l>oI67ou=pu7b"^|tWOWo_ӳ^}oGٙ-mw!O{˓~xT]Ҹ t?݀@ /o[(f#3|?ldev 8*vh{ HAGck7&DXNuTfNͧ^3r躿8f3|٘K~n&oΤ 8"p) eqk&8˿ W݉:_cUKuqxo~8WOcK348mw̍vDaIa""QMD#J@PzpH>vD|?\rrvmEA^by ]bL_?#w [Og >u{4%ĉw,\XczOg{,.PzW<,&Eus 3Gsԛִk!Iz}sf(zș!2?nAG\=4Lo=k l µ4qqc9ygs\3G,rͮNC٥~v7{ӸwbGr'?eozmt>[y=ʊ!j}7w雴݁dⷻaO^ffݥwO_7^#|V7ݓ;`{<.(W_gި}F aps t>N_u:ײ=6#Mv>0Ac?\kގ"c> t),»ހ:h8 kNJjM $L)u~_?& [K| N¹hy'[#UvW3uYhq!q߀Qu5_:]?!;9`eI`e 
$4[#>eej{$q_''IJ*5,U |,MEhvRDY7A{[dEMʨoYE#4^c6R*E)N T[EIʥD I~U*(6uRHI$tEBɦF|BW&->k |4bv In82)F&Bj-)-K5JxcnлtPIJphM+hVRq0`|@1y &or`[|)[[_PIV?{WǍ_9F$H "lM6mmdIF$#{FRϛD۴ȭb=*O1.TYn-lՌX3[?V1V!)J]҆Pt$09WgdDg| AU2,%]T @"p*3,1 @l\X%upJja^ vbE{U28~Qg-9 ~? y䖷ZDfh'1!D"QXPXXWznlDI!02ڕn/54v^+om$8x*AiAXVyAe|!œfB&Dy` a=oak$=G"?[+P -{Ka7\~\?\w~sb'IFǿuRF6jmWNgo}jɨo5oÖ7SV!|G4<ߪys&[T7op]%Hhk_g_:Ͼ{w~ ukW]B51恫αbM婌x$hVWT+i(a c5OnʻxʬҫYs}0s7y5^C|nt/ <^tg}i AR~Q:/J&X}xO @uN Ӈơaҕ^mW(-c]&&N8S>x2ƠDnb@,@(KVI,IpIWRbXi+xZ6ἥ}QUs*39Ȃu9UHf@QCIh!HT,ޫⵋͷ?VńxSسeYTuQ!}q^`nW4էObۛH%%B9L5]rI>-َ9h/zT &B=4$:ӌ;k`a1xYOkIT> Sԫi: F&`r:ʔ陞cHYޔӿJ0V %tnsMy7Zo|S˧Gu}3aSq~/OǞyF)DڋS3:O 32@5V'3&שrߔ ^aPMi $a//;Py*Bg.>BǬCjLέZ~1{PE"gM)RI ]lcP 0JSR*xUbuQhDr㿮ӭM됒)לƥUe J"qV ^p(t*eI.xx#r$O`L~C(ܖ!Q0 U ~Ʀ#~i6Z\&ShPʍg([R&ky; iāCAR$, i\ 4"$퇰А<&B;(Q~dw8H NZ!ąa~65Cm,* *'|ui@=߸O?b#g3V<$'M OhGmQ~ ['@gy)S$ @;3hH==t 2Lz&h`#e " ?<}ʠFv 9&GTB^)6Q.yZHd ҂*D#j8w$}HliSA9H%{/b$=W?;PFRvsR2 D]I[ #{^K%(dUJTel˻D&6d ֮^iNwcJP9,4Z'瞂UUՆD 9o}caE, b0  yxm Dz^VYzd!a(&@X:0<~8Ký[Q!a$b/b.BA$.+m {!7GP)b+,d-3JE߯]qd;ܝq]*ls6w|\\RT8K.ԸEz.M2KL8 &[\)E!pau|evwȰF:ƃ ; Κu[,c^cDCP@Fx G9L\K>:,Zx'ŏ-91)xM X'=g Z”Ⱦmsi cJE杢ʉT9"=%Պ[h4#\{dQ-FO9\us_u"K }3U(amZw yUl­ȫw- vH8!n2ŇW֧6:i-5 =:nJ MaXkXhV .lŬvX |}OΫx˗nj^ /gZ-ƙ݅ Kյ/v?_ެ|fbAo,%z_Ga5tOY?Hv_n>`qG}EzuYZr]h@d!vkqz˻!k:w몃vWڻu?{D;ɯ8[[WNoxC"Pͻug8$z>,37 J<.aNM鹑YZFf)oȌ*e oбԳ^="rQT5A}P][o[G+^vfg[uWì  A2Oڊ)&)ίCJ$:9DI`;aKTWgɓ`s9[3[q-K_N>ɼlx^x[kB)`l<;_zwýղw'v{?Fo&~Z㏥&@$SpΧ7zW,얳jq7P/e?m;&%"wf=}3'p:p:sRpE& WD}&:lP|(\8՞`- jQ(TEKPzm>@ּhvisCS'u^ߜX*VѱYLT𠹩U6jkl/>Yr#H k5\#͋8}vx[r?xy f#(6ɮؗ+/i|{usO'uyv'JO+`! {}[u$$! 
J \+{gLJً}zi*.$j~k43WH /ⴔ,xQ:tgQŧS]lOQ`>$ 5MN6Őf'O50Y,QԿ3VB%ˍI*q>x%$-`Ѫ?,gy[6@$DQ(ѱ~0 mscW'xc >t(,.w/_7SKC6i>9w ]͝IVÏ7ʡT2|OdCFn>$7?y={>g4 Zǧ7'r6B骙}3)~zp~7Aa d0.nYD$uAWwWV%lz9Nj`=1'8Ǔ:v.ػnQ5ɝ5o5>7$VNrT~G"fU2q Ը8h6tAWz܍u +KUKĪ-0`vovi xnZh=QYpX9htLBmfSw$}_x2%8s1-VDnIT_FNg(2 Iѿ=;`:&\zKu<#%T.-' KV"ёx)EbN,x[ BQy;mVxLJEI"K&^'XYIJ%HLf^;L&@5X(0t n' -͒5Ӡ"ۆ}Ks>)i5r79DwZoDRBE$@φϦ"b\mq䰤*JnknLs}k7yڂ j:@X> HyŒި][ x%{\',  û;ˎޙ*//*}<;2-`SNô jB4E ~c k&+] "$(uz-tJ- ڬ!8ډxVL<)LkEЃR,j@=rp ނ)^$Lh[R#ķXxA9 A?0@mRn' ǖ#|sO^"_&2.MwA>]|o]@z_l,2o򩳋i(9?DP*~.w`? J~uI;HH* ,ǔӵjyђ"eK<s&eK&Wz/ .[}=AD)d`z i=OA%#M-Vx<녡(ӆ#xB25OV*f^wwX+bi,9g5I񔣷…ķYa#^s6K6bcovg87p^ . JK9֟uWDzH 8lYdM2P)Xa6;zWDu gH[g SmkQJ(j/ :5p@gJ)dZÝq  !3%pLT(Zny-əJjC@ Y4IT9C6@їrQK9t*M.JUD ZI+R1 k!bw11(D^#f>'F;MQ/Q++P|d֊D[5⤧5*&&OQ(J&Iq4*Tп{j+"K hQNt+˝+$}JR)Ut@ E:am7Gad?~ `QdQeHF95RnɅ`SJ19D=H%:ۜLVhB"h(EnGqnݘW RI^er(Y J:%x*.%O^{WI:A$Z*ڏ$4iN 59̍ѐoQ4;KIXRGu&]H؞"[0<zҞD"Fb6%:+ 9mv==zeq od|8mSlئTMI"BQhEA޹4hdRDIAG4J$)8ڜFtSolIÐTjK|t?(P 'az{H.ھ;I}=ԣk51a:yH9*E>{Âf_|y:z9{XU܆.]"}ZY%S:IEE8gޯr;|xbp^=0NB퍗(@WےFv#r¹"6w K_/ֿ-ogh3O+IJ6Y YIA ]4.o"뇰muQCBfh!nVd/"3WK@ %iFm> (pWLݔCob<>D7ŦuAFL&tct?Sy䨴Q C[hv9*2w.iTH.l~rE5 K^ *{tv}٤}a1E-E~;)K'ybm}lKqy9kVZ nhp۞!|BQVU#6(% Zn vf#j(j-Yw,7REzH! 
an{{p$ lҷшZy p`1Vk4% JB4jmz6c(2v.FhlDyXXhZ2]XO-UԑH9VVP1_9bRY.e&8^Xj1[(;/ٸ%NxSKZyD^ZMia2QH(s d u11$?\1ֻLŌћ_}YQHa SIh뻟LrYC$9?>E.3LvL6/?MꋫNoog/~GEng~{a4%$72gYy}㪆E7xMA#㑼lyOy+*}<(J>OL #l{{H!^Yh5;25E bɄ-@모H"FEnEi 2V~vK[y*ꢒ _ںXxJʋ<5*rX,uON Z=Np۽ʒa媃a߷ BD^;7jKQS>DۚwuAG/5 `fO h[,%*aM!xVZz^YMmJHKbGQr > Try%Xo2UQ4Yl|ŶT/r hd wʺZ>uYX 'XlCSFغ +:rD(fؙoOms>B2 1ʹ@!@uPy^rgyk 7}nbh  v#gowڻ2~ZB<9|Aܠ2];xYQ |i &0/0DDLQ1x1B2+7;/k+M](L]{_{Wz"^WI_ C8En+W eGgaQ1;ih*cW;`1c0S[Ńi u%,8o*)"$_jLeɽ1 %)D躒 @&V׌K1) 󠮬L;sENcKa ]  b/AqK.VbfHsu19=↭8Bh{b8RyЮZ / iӫ4HJ$=Y) Qldd pl81DŽSOuòƺU[q=IsO-COwȐ<ޛq$I^V֖kv]kӎ蟮ģԶK9p[b\ߘOipJڲms3Hܱ':lQq;:!'{~@G34:|"ÆA1$Cpk G!܀V^p ۉmE0چpTF'YH& PܝJ o&i 䧈a!V〶8d a=7(=xw| k6ޞInɰ}{H*m8cFzZ;A+ D!xZvATJI AcOpåtvVxPH)[h~4Q1D)GyBn(>tX9>&|qB/o2qns}snFW aVUPC单Bj*"PU(UU+S)/XE;9/p[SMy@׫ɲ&4פ^Mbz8*¤❓E Ӆ'R0 \|}?,LxԫɿЅ)rQu~.oW?[LÍJ΅Ty5r*q  /pUac>Z&|N>D 'bf+Z?E!9`f;RRpYosie)du:җ(G]B5S%R#Es5t宝mw|t-put{>7Wet/~Jq/~qX==qSeVkz7wϧ#w??~~&Rh^ CÏbcO/Na=Nqu}lvmI=wh35 Y8<WOg2=acc;|^!|Ɗm/ː e8D-!oǼA !$tU'H=e=I _LAr19Fho!5_2#ZͺȬlcɹ C깻]'U/p_"$2|^'q8]ף;JϴsM/`呫=Fre=: чc!'YTp5.e,o YfymY ie<SΔZ[ ϴN!17!V6%),ޮ{F<|9GzwT x Δ2:NZ* XWFƽyZr(oc❮WhaY^2fpPD8x)0BÁMfV>] U%m GXZiSFء<&v6B>|֩sFkxa'T;_Nah2wPL)5+̖D\S!>/o^b{89|4B^]u (LfZ>3Is Vt&I2 -(QX­_|훩kK|QmpRQ(֎2|::Lڻ6YT^8waU]/:kE9='S"ElV|Nuٝd?ߔݘZҚa%>uӕ($P[Aeq }jo{臨c/WHyRZ r yRډ kj'{a)Zȭ+-, ^T*YJF+`[Lt#Myja`p4-.o3q;+RIoaRl z64;i|*Mz F XN4n崵#=ӮV%R LWB#ǥ@07S^⯧SrJoUW\;}j&])<#騪ɂ&pu]j\*+_R{ >>*|V=ABΘ`Ff4TN2A)J!Iy'*xw*=TUgmbc]2⟝eA+0 ˪Q/K"GإOG 0 #T34S˪#aJĶ%otYw۠Jsb-Ɯ9I!SRoH 0Rfi|.O)F0K|SjNgiT8Ke#zYRk9ң2q) ^=V6&#C,񮝰m_s:ր6D\GThWe-J Oakz'Iy"q R(t[_>;w 5qj^7LJ1F 7;Z)*`tG7f\]Q#sϗQi8x!8x1X!&=Z#Kqp;DG~]JFcBwY 'BIOЦA|wVL/]8&|BFpr[6;QYNK< T x:#l5eb`!,kIp߲nd]_15ʀGlXtW&;(1)l+B:hoKy5#9߶bsm'B!36˴]&f2MMp*s GV My;yVV]p?h~Kt;IptLI&/QEx ӊWoU_$}6JZvLsL 0bt§/n(ɨO`CpQtO9K-wu=v8؇{-QDا/0.fyR*vi'v2e8շ[^n\EP)b;&#ְx% 5Mu)2kPXZj0~Ђjhe54V') SUzTfeajU_ƾQaI7UYXԾ }9bV R?+ s{ecT]q nѬQhjzl}z + sc(. 
sMp EåaŘ4LYxH<(/lSIp5YN3YyKT .01AMZ-4/SzϪKY4\M9ZO{ Z2oqѣ38"~r'*spD|uW.},+SV׻.Rtp z0# J333&1x $b|/jfS+;{0urO3|I<{6EChbcىacșCzۚEk|^R1kSُ"̊_hY~{y]xiOd=}XEmbS@hT"(0W   ZlKXɶĜ[6 h\O7cunh/WX=C+} hY`Z/{_Q=Ur7g?Ƽ98yiKjsmz_]ҷKwI.ۡKyCsNQH1ֆ}MɤRCE1B &7n7tR&h":xd>7F}~Cr_&CUXsyP 4-;ĄΧ]grԵkb$ɣmlgimk:]>j$kws֩)TeiZ ui ǥeigeY7 <%ZYuDeei}a*9޳x6JY]k Mg} @v\Qu+cNu^գ\8TW7M{H'|cIG< )dӓ=7wEtCp"zYMQ|7OA7I8uEnvJ7ڃ?ȩSY. NPþtREnԝ7UĂ 2K*Jro7v*D3sx:cm.gh]t[6e @I$vdc6tX&`rR *{wFVm8]@#hl} ahƵU U Q2%Ƕ Xh4fټ}*/_~:>yLwǢNH' T]ʱ!ܕ5GXVl<8133Jllp`aߺ(A ["Kn#dC$Bq,@l@GB(;&lRy]9l\pګΉ mB S4wZcm Ժ[}u}6uwZgpϨ9Ȝzʸ=9"$rPOs$MeJ G VC_sҦ2j܆قyl"_Tǵ+jxDGuMeh^ӡp7kňE/f'ċZn9PG 12@5wU.=.CQ IUYVC}9u#a}z]rӍ~q@wåk1&_T!J6/Bn/mJE%NLki&rXKCW yZ`0~}  fUtK:N8odS9@x40;ЇT?iwM%]?IsˍZE0'N,=_NԠD{-!zrgB`R~@9/:~m8YO?Ǚ}xҷ"RG5{ޯNҘvkpXs{e,Rgh^kElݼ8?W2%s |E/Vtɜ7~`[~ /RS `;̛.%{{++@'x:.Tْ"Pd&;M 8[{&Hi&Hwi(egf׌ 6nZ@㗪sciaxyFXjS5ajo1O5Xm/f -aVy^77utᬀKo1|۟^BM'p; <1Jf܁>OVMP~rAq x-9=3Ύi\fێZY]408n&ۭy1DK/,ŐgXvP{'6nW/Z'+2*Z tϮӿ0e6ڌ9#aӁO6SCۂ { N^bSə]c}|V9Ϛ:ozhPHrbP7ufhyo)q`goIYޕ6<4lyKq/K+ʒ#x;Zw>{e <',w{tB *x~5z{)$+ц}m!׉w5K?w9>?uquDnx?"3Dw֡oõχ{mԐO??O ܴ.:x-6f-']x~xV7~ۏm|Ȅ 84 gwN+RcjD1bkؔv1/q]| &c|ܔJ)+<: ;ԇ6ύ0doEn5abj9&t!/9RDƉחY rMb/ R"VG!w,uaĽ5mRkM+|#l-&5^#<-u)6H K4aEqql\DVMbaRysΞNӤcRogiCe39(=5jY;qG?y׊ѓ0F@MԹs|=S1־ĂyYZ#`lJϸl<3z*! ܘX|i)**uMhר_!Ma}0ӗٸxO\nL/WmA|$3Q|Iȵ?^\ ,7w.m$7E'E` [d͗ jkcˎ$Ov?%Km["f[`f,z"Y|j`.kyݓy,ZF$6Z`j1JۈF0XXԢzK{Tއ,Rp|-5j}i9,&Ԋmd}'|S2eץvCV]:B[(ߺԚ'F2պTR9.5Gա"9&7hVNlT']*;]zp,Wk!bZwQR?"nNDŽ{@KrD'0+J$IT8ҥ;(H,&ҪҮi|MQ,|a՞du@ܻѥ77k r_.2BCҧcB_ i?,y|z- `iZo7{3>yRXwOr ^}77CfLl, oNtLed djyCird@_m.KN߬6lC2wna/'hſI,gLm).Ķ{؏w"a*kth"|wѣƣ,m[eoAgmafo"*f:4Z&~\οv]XvN8EugR>Uy-<~ެjڼjkbH2LtJN'rV^_邡K{T=y{?y "^kR'cL0 v{FTܦs$͒P ˚6>:(_RH7G㲲JP`~yFC䴄;htG#~lt.a]sLn@iRs%b ]]22px1d0=8c>v.r)g$ |XՌ)$@KrejT!TGLQ `iYנ$#řScBDD"I5\ۼ/u85!%)DfvT$K0"<UbYP'tYSU\af KR.Jm?VX\.Z!*& \ D8㌠ k)W;@Dy`Uuc&VԸ_xPP;f "yҬi{mS!W-1ng߷>f|yrXl*D&SFSkTlg$?3;$Ny98vl2,}@Nnwkp^pjx@1HGg']wtjuNHw Z'sZzsѰn-; fTzЅZfyR(-?n>.MW!qqˈSMd$51E4IYBf)<Lz1~?? 
Pl,upEqRFߞ_m'Cty#F:1DHU̍ZsMR`Sv4Xҡvԡby:(Ib.*Sco̞%L8"$M91[-\9u!3h;/ >e~*@!Ks% Fd)KƉ'~=Ҍ\nyi!*D\m4QUm//sWrgQ.5>|n `6,8n8U=m8<;bRQ !c XAMVP?i}ZҪLJUrCtA[9`:p?l%\CW @zN2׍uäS(; F866!%brAGB{Gtեni{9V*:C Ȱ%2W\:X_9zp}֥-0c{GnH @tڌ{]Tǂ=^+bp cXlv=2Fa c_EQXwR'/Nh"+n=:gD<$ &>#<]t1)8jQ48ޝ95|ה"]ī$bjIMt,NXWd|H$;U'*}`رeG~9T p,br[xQNP״qAFò7ϑȡWv{O8{P!Lt9j9z(D1s{A-t劶] 7j:b 7R(hCFy.M;5;} iN#3 2eؑNK9M,Q6Cd֝fXpoS}mȤ>Q۵ndqZgAy]ٶZjme޾!Y#s$,A,Js $ )o!rYJ &I䯶??|"}\/R"yЧM}ZM%ɸtݺhsJsƛTR.s˛.>Ѕ+T.wxV(꛿0쬞OZy¶y ͉wi$d)D_fOtME Ȓ?t퍔 f6}ȧCUU0^ˋvxV@ScO23K'LL"TGHI&)U>Y歰VOw-2}@S‚O%/q&9Uѹv^oؚE Z5|;m~Zlk0@B+& p_̨Xf9@3>EEk[`PªeB.k3K&64v_,MMnL4ML3+g鹝fZ/C3/;|`b:sN^*7΅/{7eXɭ2U@eƨf7bj)P()tͿ%.+MS[zEM#S/Zspsw&<;bA>쇨3:d̓_9*>ȥ;%(%>)jGsC?=#g3lĽLj u%_7ԛjCor*1d=fUi%upKc YCCˊ6آЃnzˌ?b/Hk|8яU̚&QXj?liJ]!)X`ă}@Ah{OC*>21j=j1U45w1nȄ/_3_qdaf-yhܳ|}KPnNZn!6EA~~n:p16|ۀq'ѵw ϠvnmX7 *m*5݀Rn:p26p3yЎ:{,䃛hcRju*.<=NF= LzַWߢ<6}iWN;۶:̭fBpHxWi1_ZCy{DŽ>V(J۬qJZZ6=wC {!d; #$ͻ]=34$ Pt pPZ鄄R\ihץA!<}0 +U[͏P^ ͽ=M(OSsc9IaIS+f QJiMj,=sHҋy8q·Pyo"eNTdſeXvER~&M+(h -pONL@Ǽg䇉TwDtt8DFC//1v+.fup2$)!e+EZ$b92Q$c!HQ` &"WSqC2UIwR[)-Lߒ_QD}vѨ.8d)SDMPЙG"&UQ&sg1({jFQ‰͚G>Q4#2V(0Z' 3b,|9frsWjtUjưkȩwީyqZfx6{Cҍ=%DXi H+Tas10kԙ4ԓ4[ɪteL6i(u\n.ouDPms/;DJ6U.c$ ;e鼥s%?Nm -ICx=3=Q%z<DЄ!'g?,O%e4ڙr0owD烟 5 X:3hDp's-u's~*\GJ:\O 6?f|F\jjȓ|(\ju ;IWS"D7sˑ5. j,5GTT~^JmJ\,Ba\ΫLyikƕȑq &bSͽ[3+&(n:p26`їT[pwkB>ٔ䣏8_{7G mǻ \޼[uGֆ|pmlSܶ幼-911P [^fYZxY|qLCa\F)yͼc X] q0OHJO=BҼ !A%uDs~ 0N[yv>i YGۘ/C%)4Fzʓ$%ԩi~M@THNEx&uT3'jxpIi)m$ly׫IBq@xX ?Tb1'ው rx9.":f;/L4 鎻fnЋtwy3yǵ;vGF,ϩBG8gHD9F sAcAgJ,F647_Egh \Aw#''o_< և$L2¢DŽF(xqN=ǝ6I^Q2_݋Y80L_*spF I%)Aݫ$PQ@!A$ba]XuyY?{=}OfumA4y+g(ؙMjEw[AizӡA{bUVizrHwr/UV;?NѲTt>A)m<.X[ nca|'V)H]Np{wXܦ4bX\Mkvc;G,Gl62ƒedg}Ua}1Or9[ِAЇ;?w^ku0!7zfD{a繲:HҐUu}arsdx#Ž@)r%@D(nLJy"8 fdܝ;uSYnwFp l)~´t"Rp D^T %^z3|  ]tb7rz'(ƞmڙv|@j2ל=wMp쉲k63(vԥ3&=k?\Hٱ׵뉪ٴnîÞ5 NUanH<8^ٽFJ p"0G>^2zWx)1:1Iuǩީ{Yo+catnOv6Un7|_l)[k3mv{7M>G3cRB<<'J#F#84 I$=A%R9#^&K D)d00"#y g*')n,eA \47:RRىU͟=^asZ6T7V6Rb w{51O_t]]LaYџzGQQ4 ۺiB$BPriC\T r!!i!cX,9 MqNO;HE2\h2!}f܂bA( 1{xO94z*S`YreO'"Zgb5! 
F4~*+5^91FhOZBfL+J=o럍 ' ~V&}j@[BJ6VU>=8HG+Q̑Zj, EZ4 ;c$ IeKOXz۴ ]&@$}t}E q 0v0 qEs+ȟl!X | S6@Y}O{)K~!C{3~ǿri[c2 K~[Qظ́DUa~)%7ymӡ\TέVoWXŦNokBXFuf3]Lc4oW}UAl{wzaG6%ZoF?XNqEzQJllK"XU=x@zOfSёq)BFSG7[,>[FHt\8edMgtC޸Fydtu"nETa}%otC޸|f $>Z_q9w03yfZ~.~vSGoQ^|~{Vn_ݶ;751-^yO(&]QV W{ȋv[w:Z?OhSx.HuDɻM^icwmaenf|ww?|'whM=UM9|pڹ6O?L+?]/7YI_GՍn@*n`vyǓ⚩qM%\ dXbMmi m#=歱?lSF9phsbĆ*]8AU"۪\ZǕl Gk\墇a}Ԙ<0!'ș fA*0JL@Gd+ΧԶ?^l6s.;c8|$Qh8r@}6NVPs_ǡq9:]qnzk>H>v"e#/ѣC9PgH3T2Ԣ) 5Gbp P:ET*DT\ARk 7J"M hIЦڱRQVި;SN_+.>J/-. IVDbN-?1Meo H+qҴ6TFB%I J*ė~/.AIUrx;K׆Q3NWpF9MKW7Wp-v.BCc41F IQ6J32SЪ&yu]c J7ދ2^YHVm^45{}ɚ^| ^H3H]]#uY@K { `I8K ${AQiͅU?feǧj+XTB@DM1M3.ۯ$x4LR@44'3+sa9F{n).@9Gko9)N;eq21!B 5 LS 9*`2q3R[{69DخWD" rĕR9怤Ij,(F a=Ϲ\^gX#'h",-r- -$+h]{O3D803٬5$C\Is;K$,Se VFhc$v+6i}IL5Jzq+IydbF-=Gҩ`"$ `OW>nۄ;QRRe~[!Gq>nQes|)Lnk'<}ύ%Fp{ Cf+ % :}(V@zV@NMV1G%o$ mH3dH8[!j}rs&\*B(Ziܪ4]^ 1]<#vl3zG9ڻ~Vၴwb"mڻӃxڌNwŘ`4;h'@Yu^?DM54v/? $m~RyՉagD :z |^l |؊pf ׬\h,^xr2\,(hr*f{m<XJ&rVK[&}jb1Ŋ-#D1r*`mt[]4§ h”Nl!Ɍ[3օq)!GqE7 (Dc1r2Grѭy.Z)ݷO-ewf.~sNVW]O@ (]ewŞzx+:|L Wg!5;|(ȼz/U V K$a`joLX=9 [T{nĀLX}1rx[o68l?߬LKm}8eo4>VjESٗy}hރž}ڑGg_|_?^!cP98~ߟ>  w(k(Z?jb?hgu4ح׮W RR\+jRZ2^R?/t%֏I+$@D(n#I<3I2hJhlOȸ5bw74:52i^d6aMZ.=Mb>f9[|uB_Ku~GTx\R?[E`?YͰ<2(p `e)ulzNfM2;-fwM3~Δbω a?Nb%ҕWJ`9/`n(c̐3z|KOM@}QL`(ۗIv\3ާJ/Nv@AA[P|z+E7Jx=+(QщDpߌv09==c9vK:Vv>eML}CatMp <Xs{`pA`#38]Q,_[1!+t Gsbca]xPq(U Ď4_gƤ&WLjN BBeLcȠD ORp'cz$rfZ,i_4YזkW92$kAHvoo4n5$ nDS<H*w"~*.) 
ܱP'UUjE׈wFK.󇻃fEW[W*rsԥ)MG"%IiۑrL fQ/*Kѵsw |g& L>36̦nm[>۽1wpK1F+HI5~*'wôK):dCG|آmo7.EIf\:CT1r&Q4jh_S1s(z צ&Iyqff3faӁ0BdOUccGf6c>=埒r[Ld#7ݗxgsӺB"?'ìW7;ZW)gm&)]D&PaGNtB z `0T袢*65?s ʢ`RQȄZijdQD(f F14Ӊ`ЅxG#\ p-Q2}FYBQmb,%7 Jg$"JHB i2>llcKGKq9<ι4ԏqﺢD^ s}3`B\"9?O_#DZ}M/[W13?_y1ʦr ͧǜmw/|yt{D+.5z1Ek*ÉDE؄2`?~rzR ;Ւ.F:rЏ}R9d uN2PejgL RJ9^X!R#㼭 ?+e9aKXVʄR#crҳR9|BKV=g|>Jȵ)h|a_H`mY)ץZJ)\jR?RoBy//F+=k+eJh@T/)?+uR3Z VzV*JXtI͊;_'7w#GUjo;n_|&Sj[=pǍsZB.To]2GkQ6n5tb+d7㇇?%xMڹaJYhj[r4|0~W볷ZKք쉙ɛyyzUKb%N2=wdY8[ݷ:jV]ቼ$4 edvH՘0c.5<#$Z\EOb!;: X4o>HCzӹOP>70e p@un^y[U8ֿB+[(L[[CZJ{44-@AS0T$[hƇCX%n`FaŦ\p|(Mx7o8+lV`pJdhzatv'*aV "e*86Ɋ9lS2Xe5J*6&ֳ? +  X6MG?ln_ywuBfo}Vi4{:C $SLqtS3|:F҈\ :$"$D@q,V*IjSJ5㔼,^EH HXYPc&#)#*<֐H,cMdTf2AQQH|'y黣S!2Iyo|炊O~QυTR bՏ=(тq/'=IװTVzHY[)#^$].}$5ʇJ?+Wź`Z)~VꤶVy[)lKi/2XM ںBB\'*吡 %)BFj['VYF(VdR$VYS.BVs5M.sg>([_zumL+F_SfRsEWh]q6lՔԨp,Z7RSD4G8ڧHUxutOIs # BW(tK4BG,DAPWrXρuAA/Uuc*o+]zʅiW'%;5qjxAiUXƩCCxn:8V"*-OC!m`kA{!>q>*Q`SxF`é`Co@|!I+A3AIz7(I`Krjs" ;be} ݴZ?kT_X߾r לH!8qkGg*(Gx\jχlTK\5[lL_)k[㢪Fe)*4o2LoCK!ZP6N&#۞BƳif̌cIc;f+-QjN7ZEG{p]gPܿVıP/=Da8~UBeӜ'M=G[j=Q#|'bCy__DIt-:R'(&x$v9տ=׿B.cBd̙c1hiA2ǥ? 
\!D-⎝ ?clm6zCW&wG*P-9^y+G3YSU R N1LdK 480c01kš b(Kgt \8%F'o|l'<),H j!hLAe0c),Tgq1ȸ2.\Dt e y ur˽Q-W)I(CN;i2%"XHDg,N5, vWW&go^*.T,H$m0o'̳m7gWk7[^QL4;XQPק:`VuH߬j7$z⟟{0x6wHXeeҌ_Ojպ of eåS\W[qS\"@[5揬&qX**{*O$ k?=X]*:vjuN3۾wO1m{PσO *?8 $䟯섰*/+*o+z?3ѝ5LJkuQ:-&"Hd,z$,uֹ^]\~旳i|~˥..!D T1(6a(1n{.?/t)saA@+B˦XG ZwF çvx\}\i-.ba‰]oyc ls`k%AA1'+38R0NL_<}s{Gu@7켷nؤ&[ȍrj)<Ž; =x p RbK6UQZ5;8'ySQ$t{os:yCUqci&J#Q}ܤ{;R1?&J -Br+Ȼ}ac> LB!D8eI3͉SθH"a$I)G2"1|gAn7 ^ؾα}!E.3!h*`fTR#bEH$B0Dtd$8x9Y_mmH{POufFiF>VfYD"`q T).Y(HH4FqDq@+0,x60 u8ujL,FfH&tJ(O"0bX KiUgD殶Q L+5r|G2fǍЏ̿׭Gjqybr|ӣ`ϯ*_kd&cӼ}rL踸Z?_ya|\#q6HͷGW?8Qʂh-?TA|bW\(E_+ǻ;,ױ]ϳH pvG|13Z&_?z?I;<00ܝAb'Alw Յ ciaawp j#)¯Fǻ&-~oI"H֧UAz<{\ HBPv4 \]"ovi&/1 /ꯁVֳ2%7+ݘe%Z y&Z˦8~~(Ѻޭ|L;xV_Sﶼ[P4OB޸ֲ*+Ηޭ8wt꾣wn0-@4k{zMdSӦ+3HP.w|QpOTNaxTBIbBYIUi2RIxl1FheW$1'iYjO7D14PhnүѲpX[+pI(N2d Aw54h&k j / Qp$_.|9]Ev]I!ʲkH61<@)\BtJN@gJ]].Zg29%\+ uWhH,D˘XqW<@6J #.SKU)zs >/9ڪ,PRP8;Dy7輏A(3D4 b#e8W,'r ϤΦ~Jr $fR" -99j tiyLSL; u[LJuce2`Sellq3xn-?˸lK0{7|V@iyzX a&@_ q j&&td0?d<S9]FMsޤ6inu2͍7&0VYEU$&HqdSR#!)^;7Bv!UQvZRV aϖ:Q_/ZT=GTMr`iD1t=W/c~.VO~?BUfhr f9n*ɤ םk40]C,i@ѷf&KD:^iÄaTV!D\JpaH&nK'1n2W<D L+X"B4I&M 6JJLQǰD,URkUXFOq "S"Mb%8NLlc1BU:cP!葶T ylxnI(Vc9G@hؕQB٘c3!8 FH !Z2-F $9c^ S>rCB벧P[QK7Zdiz ɞ6ܿ szN<@b]sd2qD%LiU|a5JgHGj'`z3A~Hm^ec= 1hK,VuXbJ#ĴkӲMJ0> F&Jw>tƘ[m/RW3:@m&JDM:0́,]Uy0;?@(K{`$V9`K=+P1z.,1X,Oy{:Zyvq |2Y,FqهhLu]to8JdOEڋtiutm܏nR$WXio١1*>r]S̻tR\$p58N#K,`G*u2bͱ7L ulȊwxxt*n2՚`G&0??o qeKG.JTu[862~ 4`Xit.&P%niܽ4͈)9@}/o 洖Z?|/}sZ8-MIit9U9 grZ([4g/5`ulj9Du#_Nw O`iqJvZ؅F`Ԍ)wijO;{ZvPNDҺӦ `Ĺ YBѦA)IRr6{,<$fnB__?uZ^J=S(:] l&빰]zav,l7$0p ra(eф&XJHXXe9$IKYIߞw9z9s0/|ߧY!5ږkQ$5J~N>v/BRrF@@bY+lt縟fΦ&oMUW4` RA|N+Z1-rҔa⨲;VbF cRX0c;5*(*E}7w˰ ail*6Ƅkks98z卿n0SߚggZ?LWl2r׽񌨷.&<1d47cy[( zƽ$0 L_=>kyYJ!iy@l*cD{My|ز[Ry Ό_ ?6T^O'n\n䒂G;⁆RwXۭ_aRB62ڭ7Uw(V__7pT+=5a+eg]V4^n zLZ2mCyY(1\<.PwnQin-. 36Jw)oSGif^pPfʃ@u <"&XtxtWu',Ao'/q>N'U;-[jf@m>cX=(ڈ*¾P'd΋sJ%9J [+( #DƂQ;jM+( dÁL79) 5ʔqdlj$"'BZ2ɦTSL>i)p*(M\܆lHKYʦ,e K4p)*I'GGYʦYf1Y*t45Yf@b^.~Ś,VAxfqѓ(:ś,^Ax6.ZQV1G^.<;]$]60E$@[u0ӎF}2)m.ȳ॒sE^u? 
Qc~`Ǝ97~qB?N&&?(EqYeך$rΜjJC{llj9tb#dwf"4v(BD*0؉6G42q5rH֬"p(g]VR>73*CI8fPGZ/ ?677Wl?ᝋߣj޺ֳs3HuYpt3;Xp_&~ ;+^쯩h=hodEϦ$ Dn#^> 7ĕAG" EnO?6.(wCީ >m;c8xw Nk ep i֙zvΟȾw@3ݾt~?_|;/&_t9Dz|Uzл餦\Iri5+5Ut{o_~:ov~_d9_ }}f|SRw&Q4At]L2^(;}aM5 [xC$fӿS $.xdϢ-ZW#w3{%B4&_Ť-poj{FuKlff/+jjc߻ę]vꓗ"*D]%o^uS$w7wyMA1 ⭻lo\1{ͰMn:.s$?2afdTןm>Vߌ3^9[4`ȓ ~a 79' y7۟{ۚ^fs2/t40LrL49GP9yٷ-¥f|l2ϣp|9Ӆk>g׿O` @_oiK+߇fFo] q`Ane!&е eRfz:I(V7@w BmҠB=:]va_=G)nYao(=럜Imfogkc /oǢ c_V;:KПLbR?`咣]8g;E fV;l泘(n,߷y|i ifprSHXp]F)E4>Ղ7Ã\S8$ 6DX B_"԰Dp댈MLt\۔ Li)[ii5>ԶՀOб_ʹF(Aºo08E GRG8Ja1!)3:M*j[Ѫ`Rї5 I ͡M!&@R)OU#QN5,4E!)\"2 k0@i3Jƽ]Aw Đ9H}'Xi.J8ƤD1SB@k9 B)ʄHDRCI>0BPS:vSۊ ! rs@KDTxQ'QHƫഭA1Bȱ'\ ˵ɶB2 1%hBOD0Ps)R^x6R~G_} 䖏<@~"H3:H $0/&J#dXU j",zMQ fVM@ЫXM!ߚnad]Kn[嫗9t11y?_K{Е0T?[Ͻp>+_~ޕ5Ǎ#R*>^[ѱkxY; K*mU= uP,V@U/K"D~D"lM/e湿|s\8.hL;RE-C_ׅsNM'vΧt8<_<:<z1w<_!_>󝴤*z/ZԯYw ۟qx!al{D@kז/'pkNFCr@CCCSG-#Y>[VR\{zP:lg|AZ"\佻Ob=t BX({T*HL$)ٛvYŸ iB͠N"pMjz5X:cÒzRRc';Ro.8~w7=XrvIʏoU|nW%4ZPI3YS- tk z9`Do#SmGf~T/oR*VDzڌraYT|3{kjrpM/{g|W~~Slm$Ev^/dz^濲& VuJqUYUw z`oWkv˷xPW%(+$䙋hLQ GSk7YPGtbnY_cPjXM{j6$䙋hLa>z yOaZ4v偏ĎT>F0aŰVh;d:Q?XwO򟈕׽}t>2t8RtѕAl֫BC}` ދ5Q#d-#X0 mTD@`PY ЂpQ# ًEB"> DIRf9'$3&:05ˆ,5@ekd!o|c9-xZB`޳"@ ^;e$!(Hs#&a$Z@l!&d3 Rp9EE .H6 .>%Xe (ÕF$I$:AҪzC%5H2`R3.is& $$߲.+T/I9PXǎj2b1E%PxK3Ԥ y"%S~n1hXN!m8capo-zڭ y"#S[{Qbڐ[$61lNSb'm !\D+2=e;∵&wQs,L „=keCϥV6V6 <ʻ#Z2\bqJ/rntoOM,ȓ_N6M! 86qכ=/ttneh}{dY ;^ԬZmhbƦ<xJfSby 5{K'JTTN(WN碜P? 
('vb\`;PC7M1 Uwj#yȉ޺_j~϶; ML^;B$2(l-\ `F*RɱL1I D a_0\TԞQ R䫋qT6bIJ͍YMW*бZMLFQbVQϖ0!Vӷq^~~X^N~H-+}5+Faû*dzbR< [ D5Fu(.NÞV£'/Ɲ zWs 䈢 (ø8ŜTPk\Q(TiFY 00%Ȏxyt$##d2ɘrt>Bvv ÂFnhغEL!0~kg0 J2 D <"%U Mj9{$u&p?f׫e|H60\vmDKԧ #ٞ*XxKw5hu (vn'D:sZC$N`QÔK~.قX1m@C?ɓ?tG%8wCHӄr3eLjw"<)g)M2h*5$2AٚqLMkJxf0iJJ24 |(8J{Ɨ%4e@ ˬǡ%s3 UǶ0]51CTNs~+<38cY8}C+\ V愘OCraoXS|hz`Mo=^rFؘ xY+JFSɋ7ٝOz7~ԺƆ=ջ E 1Dm^ I c4߹O.dR"%=3d^-@qg=;:6 Y?"R~\:`iKg0xCg5\>vc;T?#!H($ Fb:I`N-YW9&SJV,@԰J4QTj͎S0 fM3cҚS-bP7j:7-+y91-H $FRm "HwEJEfʒL)R!Hd{<`D o.8J"Tf @(J1d[itf4Iƥq1LQҴQjB56O,|gcn3_xk3 y"%S޲n EmHx>#6`֢ÔW4<-H3(*(cn<1BۀgnD-zڭ y"%SM]r=*_O ~=^#j>#=Q-H32%~*|,Dׅ:W NEd2`s*r3S"-AL&w$S'ѳ{ | ]Y!1D(Dl&-R`$y&X c r\R@s9_|ыxtg*>dtQ=(!lE>CQ5`zO~ykLHck3`;Gz-g_ #p:),#b U5!^yY@wW5`ʆm=P]:6e=+짨 RfJ>㣐cT_ViK)~RJH8 )=F\>a)l#97cS}YZR!?)EiE)OJ)I+O%wRP'„'SY2F)#P" )e0!XjiŗT%g V4J *I:X$4h}`#wʊ苕9]OUN9 <`63ɉr[\ <+'ܰ3eY)ѷ_(ndNzG[MwԅyTJ]Pb(:~+S}Y0_76^oPz ryEkS5O9ꌧ[eQ!sXoTs4IT E2ֻ=|v*=YF()U~h5[ }A?tny0 `|׾­-0bg&8e]SwǖyytI1|sݻ|t֓Je D-V#D~L_?6N\[Rv`h! ;BQ-σzw7}8"Ux9<K);3K o-옗Q뺎w}W3 hN^CHZhm@@G4zw:{,Zזv2(B*z @M=bS;=Ġ=|K)D$11Nq*Tƙa@Y&2J%JS 1nOWA\rW>orrv$]@|OʉVNbF*rh.7us5a]/^/\c>Nejy~G)tƙAOoεbXSt@iܣ]? 
̊geUQB*QJWv1hpշzElj/Wwy&QK)XBZM300JT 'h5,%,OR6аl:GRN)@RJ-r6uע at| 5V j zJ$SR5e:%BR3A]%OCIhp\5;01iS|,ŷ2*Tv~ˍ-g386f3ƃ˕1lP،n¢6m]i67GL QY07:]sb&>N^%Fm~ g#Ey³Y\ȴQ7i6OEjb$T0ݛ~ڱ_!_2ة)BkЋk]7-7V5䱴ɎH&+%C$@biNHө25vdaL0dո{.>\{yfYO E-W>/6ty'iVB@X ƲbyYG־nbW^ # Z p!ϋ/y Q,?%!xd9DZ~ζ}?NkϹՙ6^N~L߭M~~Ľ[B&$ fT^p\~j]7h+ɵڤBmHtq:gRDkfKERjK j%D~\h˅aDX k6.m)ia˜"H;cFs2Uq}?9g]퓻%Ye=6YOObjM˯̞9(fν%|ϧUsE [{{ ކ7o/&z\z {"&8p]oOn_#fRjF-W]?=f SFAx{d r1y~M{Xb߅X}ڱBBp1ӵ~5{k?jf?]ٞFbG1/+T# ӎ]OJ1λ(0 )=FՒ;=m)%2aN26]z*j"g)}RJ=JTڑWjY/3T)=3Mr=ѵw$_yE` ũuB4N*Qtqo]|M8KԊӨePw#\}X$gILm4ANrn#vqrzYA~ DHnoWŌ|:0vwH_bmo8vp 6(-$O2~nnlՒ&ŮU{w/.4}/#M#t]̊f: pe6$hB1O&-J,I=GoөU8]`)UQ^8t; `MAQA{iF_8\YE-cU/ BDOIkTlP-l昹&IPXHzp+*re<ECBXU\Hsq0(T7sCz?c!J_)I\CU%ce4"@(tnB8YtP@ أ qTF0I7*.KP!J@`ȘI φQv*+-źԉv)HFCxi85$RG O႓ms&آ,2s$sBg;jn艉V o|&4}TC!pF{څL7ߡ/.kƺ>[r w3^Cя \'u Ns (4T BȈy7ܱ'A:H=J g44||.By|u)\g/wcXlM CqL+e"JKXy[LR9$T]-i9bQgS*浧hERrPF@,Q^Pm!L8K[7])D!F]2dnZ>8LAJ@J“w]j΅\gtܑE )jR]K-> #hjd4*nTk´Κ'ƒ<Ap\b8-1%qĜ# V?9 +{B]\XS=\D0QC'tq e^b tT1vII ъ#'H@ %SPMVZؕb籆筲A,DM f[YSj7fJXȵ- aeb%Lx&7wxԇ4]KKRm"${rz2=rxi9)@C(Gr, H qN58b#M#ň$rqQ(&dKnTqřl- gIIlRIXnP~W}Zsijwm>߿$hDdCrμx!çnxḦfD°%} ?/ K3{_#{OXZm?NDqeռ~zw@[ MN%IU:o\}'F򜠺bSNb$|-{gLȳ˯Mb(N_zuM=smN P61N6x rW}y7t{3KCZ%l_+oo?V FX)4[ߝAs͏ylt() Լ/S+o/os 0^Cp:J-=VL{cveP}c1z2xs,Hs0.c?_"ZԺSGDm{h`=7v{6SFBAֺѝ՞Zs$oSfS^w?Y i3BHtm%U]{2o.mp8 ǃOgRlNj:JOQfVcoTxG>,}ʵ\`pf8V9ךZBUʮ[޸WkiNmf>v5;s45d-hmo4?%@KC ]+] S̊kNF<i ex [8o848*u,e"Nڨ9UU2U`QC}?n!FOG;嫯bv1s|$*Ce~`Lat wa|,D {A*l MXlF+؅F"PƜZ.I+rY[i:`2lNG𓶸6_(|HEۧ#^&iR2mXVzH=\IrN֩ ;zY=ݻn'JԨ^t^Ս74bKJqXZ+NUS `[na-'Gb99?Yb[bV?6ߏ:p>c"(d;Ew2dɨN#x5] XD&m> Q^.lU%E/=ʓ!`?n8xt +hv-ԑ K)n +8*+*7kN!M!fN{ƀj 8ZvcQʮ޸^AOG5I  ~!>XËe omBv+8mhyTXoi\, -H vcҏԠo8Re~||dUuXVF}Ă5:]}R(*y߳UĦA %f*nK5paV7R״y>FVް{Ȝ@eރeKޖ।5Pl>]}DD-@ } }L8ОqXCOp .`kJG+m,DpI'os:A.޾n_ n^X4i2IzxW@ׂg; bP[ݳ}^ `+H_lKMͫ{7dk% #ǺJܮB3P>d pXחc5amoQ[֪oA\IR@kבFTb\ן}Y^'8pAo,Ѭ^ x`5sS-f?[_?ՠTmTuQ{/)2g3KPc1 Pkn*r\vߦú>즏_Hȗt2GP_9x+%|8y+L3b}Db.)e]|9_ []Oq W݆EM# h+RWM@wk+i&`gzn/Mn]XD[ٔ87&[[Nw4n"b֌z.,nlJO>Fu%Yc+SAoB/$^ 
j-oyMkTlPkז^=kjX)Pu|2`ʔvlO^qrw(ɈB -s߅ف$Ph P̩w Qa\~PDh qQ&mF80ڦ ysP6n Z[x :{)ĎNLpw.4.XqK"Z!7D+,/)~ƍ;;sh!nmВrYsq$]ee"&m,Y\Ges ?2`h[FQУIAb' o~CHC '"Y/nLqڐ>,IDrX}s&nl#xrA8Ul¾B&dhg]y(>'Ky6E!QD9"Q$ghǶRVMݶw~cq4EӀpT(NN4Ɣ Tuɏh Xe,!,ZG̢*ΉW_l:8 OxlىܖźPB*dDQ9ʌ HNkΚj_#vKڀ~\ GKeB*߲bqҔ kDv>\P$9%?ͽ N#^y$ǐoT5=aQZm&r-su~= GC+9P\aQ(P B* h s8xC;㼇#wzArkTn3 "lFi6vC:s a݉ @@y#vV{SEȐ LFs䦺 UռBSw$D qU3@HdoGz"(ԓ>)dA4ͮ~Y.k.Hpƞ>x19ZpBtb hÆr@ٱѭG`jh}S? s5N0Ц#H _?\l!WE^8r MUIdMj~c\#6 K8u-Xێ˫vm.p& ʧWT#A}-_zixֲa8*<kJ7d,$E%V#Kz >Bѱ *CɵO _?7-;=٣B\+\EnI%"Vh糫W#oNj% O $!ga g!3vxS xMpQzTTY&0WTn@5&15=.c<#k M6Hx$bܼ };XF{Uy?1L$0P\=yAZw\ 69QIi"e#)5@cs'UPS(SP'[1duK.P4: ũқUw߶/ pt* pT!g+)g:}0Щ1.穝ڷ rZ͉\Zx07u*4\ݾ_ɪSCQb{ٺnnkLjC}FOFqRƐ5m6/nU>e=Kqjd Lrr"Ȃ$fvd}g@.[L26/(NEٰV?_4 iƇRig|2"Kvx?Ӭ+F%dJvD*C;zCi'͵zcŷ8{o[VI˶UPC ˳v>6Ԗrt]]DfYܫ8Dr$A`3 u~XU8KG\h"Otpm!)/:}/.p}جׇm6ّzZ;~蠟%QDuL*e<_ :Em3mRrZ%ER=ߧ&1d+>eX]u jHJ"[?X(N!Lݻ{(\D Z)gӝjQUm5<9y*ޕg{F'kUș{f;α9h*oNfOiƜæ; =)pp{3:)`w7%~RkGֽ$8':} M^NV}a ]kTWq<2i HhXS'UEiq`Ϳ*U,ԶƦ.jOeܬ ><且Fv*2*洎x6>2ؓJԔ{0>Sʛ3TLQ>\ryecEFznZoC['^%apooSfCl #nwّ#;MwWSJ\,>#k-=rgN#SfmOf&ն\Ϙ5%s).V.;KzC4n$ם36߿!p74b'_C׿mX[H?MwQ!>7{-(?5su`]%N7R)VpUprnGH¶'xۈY}}ːCL!Bo-A>r@foTz2-r1FDv7#Gְ>YLl{n5(V%aM-_.emʙ{+9`M45тd|̳|q(3Tߛ 7w Ȁa?i@ßdg[6*͡ /)y&q0II&7u;2/}hEb\BI}QO9]#EqC}E$=za^͟w{ "p_ο rU-T2)qs)O7&G<'ntQ~34}tS"_Y,Ƙy2N?_U$2%(F˜I%IOhD^~dg"X_z>xuKv0Iyۄ*doM!S7L]0| Q$ bi6YlJz zj՘ ΐ=<[RQ/I]~{C!׋DGj$ˎ >C8Mbw >ropa(@=Q>VmiSg9s7~@غjx/etێ{(N}L ?2 x& f1<&\HEm IՓY >_MTL9$qndz_I'8ozÔ3W& 17RI6OE M\B`h(w^d@Ŋ= bg_gq-G*ot3D}X*Jv>Vla}=w'f/;=ܕ|(_U$'%7! 
j5o SgdG34٣ ;z ¬~\>:FH+jP( S/V뻸ێ|;u W<8c ƓC4zTT <]#|qǣxsJNeO-S˧!{(E=W“*ʊ=|TG?Fʿ^C*_Z>kNig" j*%>T c fѵRESۡ"&89VƋUyZ'GQr#{4!n1Mw\؇@1X%+4~;V|'Y gʚ9r_ai3EȎF8Z;l_'yU7-,RRI$*VHIxe%> @2jz L;NzFy6 ^>tqhʢyf̡1XŌW@[N~ Wf;ןbiEh)P2FlY&L+k\zN)4*}M[0dJp/3Sw5Tk Ѧst:}:قk"yBdyYm < 7 |穭\֚.&^ $пMp)J牌2lVg4W$z):w{ h-OKqJN:07hx0ٌ& DSLtX@!j'[kJA}Lγϑ:h+2LO)Yd7nt13W2q|m?>%%w8^>"xrJhГzw՛-P% 3"[F֩f|ۍw"F*mChrZVlrŇ8C\sq#S1d'H3$6ڈEA 'jf/>ĦLZ"E$Vgy61Ay}0N)KLuvK%}~PF\F9* XP`ԴQn)Zs"snr PZT/fzq2wrj@\IԭL?PdTH_Ah3TiAUƃZY`2sV KɽQ6,+,]#yk({#OQ-߸i/sFҗxڵW54t΍ W܄-KE8sйQ.X0(J~/~_OB<%M P%۠;-w[Tn٨O16P\IAe49b|nh$R!r3\MAa*bTJQZOCHKS9fP:S(}yTMkx9%0zCT*H0 DL5&|. *]-Du!e~ OQsOpr*m7UM٦ )Ǹ8v`_yu8IWw2= J"ꍔ2W߀ȃGmbiz8/&磇{o%u6sW^Aq#qdBrWrƪ*;XwP/&wVMcE]תԢT](}L-Za*k sLGÉ8RKpΠ%Rœ#34Y&4P`pG̝liD S.j4Hou~Bklv  ͞jPJ9t-p6\f34o֖<nHJMd}癰gVzY˔:2؛fO7DDcH Bf{DGyGۀy:&NNg-,#7sshNl)tqU !@Ӎy&6t6Hna9^N>śY}@#!hLP#̓@[C ^BcJ7Rxvy VR4!z֠~?&nO:D\YcYYݩrK){S Ճ,Uҗ1xROw;it T߹7f7`5;kͺstjf-1q^>AD4dWgo LwG}\0mvuZ2YROǪTܟ5'VSx|1eo.ySx]ʧ?&Fy>8O>Jy|'>OPuo?-=Y˙'^$Ίl~4%򔊻nf6 ɠZE,LT궰d]:&Z 6/CX,NhIB^6]ur;ɺvSΐnM1":MQG:@Nݚ7ڐ.MdJ uvDcnM1":MQGҋԋgڭyCڭ y"zO*GN85е5+d?1ӭ̈cLpMJEim.z_}B5˕#wz'g(-_-!⫌xYx9` '/Ckc%* H-ǀ2Rx%i9&j"@+hmH.^\lu6(/2PL!,brnT&@m;IaQ552ݻ~Ɗ (5Gc>f0f?>χZSY@P㛐AXp⯲YSԜ3K+NףVƋ \K Y8wHr!Z$\MvpSؒO3YbK[,-YKiuXb&ZAՋ0F_WFRa^co_y7VQA_\+fSC k@zmXO>皩O6\,M퓊(B4Ibۊ,:yAl.#.ׯ´mc6d@ϑ0D% wdk&$KYeufme@ٯ?nFG٧btk9Ndx-ƛaPEl9ayЯT([kWkG4hљy"E%+Hr*YcY 6} ᥡBerP<`r 39I HZ3o!(p8Kf[qBSᾹZz.7W7W[+hV_H[Ƹ7$MW3L.]GjeO_!Bdޥ+}TjM.8㚍Ӭ)O?hkNCk8tY-_暇?8IWӢիz2Ujm4/X3#B)"|A"l<("pℤ97Y{&.s!FƩ\ `a6әqz1rjȕ3.(@ZTYdR-NBQ^FCBl y(BB>JcNcLkVԭڀ4˫@HƄ@1ݼzSVl>w 冿N>n,_-PEwh&DR;-sqܗzRpe#b-c>:*|ϟI@J#/"ð{Fm 2f4Tcc^S(Ds  :5jU#$nܸ6B-*%RNX1wsfod*eӖSu9[GRL9$2D@ 𞲛a0rc^@]O<5tC tR`Tѧ ;5ܧ9\KNwh/Ao ΀3r.,1.ȥҖ)0 (s ;bD(cŔPknF2jD}| D_'%x@l\AMC, K;b]S $0@#1GO#87~mƶ2`r=/_Ҝ]Tl+-g8H+q 3s[)_\\/5y9.NӁr܄`Ai$&\s@S$o*m(SfJT[i¥_<*tX-<+,Ԛuai3&qw֋_ݜ{[Uב>0tʖ8 xm8I,OݩZC^Sbz2A!IRztTݫ9tq7i6̈́eO\R=sEB\& 46jgx[H \nUH&0v:td h`4GG\(*R<C@- ldz(\ K}v4cA[QHTi`|VMN1659Pw`]梊z J%%tum55U(TX6 hx 7,dRKBlB!WuMQ6\5 /p<֥ Fe 3?. 
҅@ 8$ӉN{";D`|0 jÀɂ Bp ;81q<(`dt'g*rs:@@Ud,wTRbr~%Jȷ| P}-nK=,7p 7#H/?iOfk8{7-m3jzk $\fפ>jTwxy F}w}Y!{7suk( 30)"c4hO bȖ%t1^`g߆ }_l gl ߚ6V -tp'UxT, 5$v?Ui#V-JykL)o6?}hW,woU(Ҵ?KprDIVhT7/ z 8q1~u1w~0+-vsUZB@|C`+ {q^;}rܢŸ-!&ǟGJ2DW4?$ }9.+Ϯ?p~u6ͿGz;8+[ P[_2h QufC;>t,'Ha*۬NZǺ{뇱IoPT-T=Q\K)3h$J YuyaY(mi7IZhbԊ6//?4sA-#eC4L>}*/>GVZcً; 0V^zޭ>ZciءƟ?5;oDZ/q/OŎi2y{2 2'9x%J; 9#NY[cP-l/m1yMyiUk8̞rk8o7l+mI|UL4+*qNLOo4kmZYըHC\E[TNj,wcºŠ}6":TP7ZЦ֭ 9rmS>k݌T`bDu>ciʮ[kAZ.4UtE SMscWh˭怰N>iZTZ/?v1؆%VR4N`CjIhΓ+7\~.;F {esfDzZП.=OmtgUjXDGTΒɜpslfi2A SB}Dj>4N4訽fa^hlk^SZozmT?&WT\/B$ƬM ^Xl?U4ߴw~X7FUrpSk&%jCU%%{%ɓϒ2?m(ѧ%.%u@ ¹,"1?"JƙR•jIɞEKz~*uBT+RhqU HztC R0`tC y0 9 +]zʧ^滙)inf5CXҧYAqɗ:+a喔J^M4OoIno-S^9MoB0S/:I[=$T7?ۈ{™(V̬epɮ8Ky4GF;4fG4Zfh-_JPeh¸d*Cz6\tL 7Z[k 7nWt) 56K`ϨP r7f ! h%ڵb9߮7ɷSEh&DsW )UYfDSRD@-Ke)Smۥ"S,c*ODk!F7sgEsOҵ_-4|[Xj#XKII$7I-ri5 "Fx+]8:#_C;2tB ㊫(?[tܚ[W` OozFj)bvUCj J̮~)H3^h8 @.gA]%[#g8 4q$P*; zJŽ? a5x"*Y^=tV]BeJӞXiLkV8Ju)LCW~]-ww15lݬ~BYEXЦ%irP]*fM'+/2m wE)LQ+% :=Ce]8'H{X/m6]7(;Wy`9cL2UDt9䙟^hK (]tdxys 3WW+/&gbsw/C ˁD>fOrjumS>ڭmFFtGU:JԻh['Dá:յlbV0Hb.XVJ'IvVnS`UD)\sPJTM$&a2ӹ4O9g6].sy5R}k@Jwpmbb$=V!2iRsh JbQ c938,08S4McVj|aUv;ϝPH5H~us?)}Z^7f.no>nH"&ń*zֶV F's%c%N".LkWX'ʙ$"#k#QId2={kzsIe$%ykz4:nD*66Iǘa, uORYbZɅmh$rN&Agr/raMnu]ozݭBOzn^ "n 4ٛ*ϬVu^]oWdW 5e|Z^�NeX-]Qc3Y ,,OOn_Vaduċg z:` ~1OBvR }6s:P-z J5lPDl|0mh% 06 P舦IPjT7y SDc:j,̨N)E?,RR=*"tȩdQD#ోrQW67VVGj3|O ȽKA!RtخvT v]%H10M[.{*m{1f1EK϶Q7drGST3i깄4-T4)|MECj]0Դ_+sVsBfZ݊9ERL֜BL5+zF`uַA4:6icy[Jݝqh4N~Vi۰]"Y6bXU{Wv tƫa?k' ]!ƻX!o*:~o$&ʫ#~險6bQQް.tFt\L #JjSJiK 󗌔 킽BeRϪR[U(=GrNC)/#Ri( R XQJrU AVl-0V9;o "pϩ>㻇3Sև}j~\5"8KBhgI7R[6 gB%Uͩvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000005235412615137227457017722 0ustar rootrootJan 30 21:14:25 crc systemd[1]: Starting Kubernetes Kubelet... 
Jan 30 21:14:25 crc restorecon[4693]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30
21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 30 21:14:25 crc restorecon[4693]:
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc 
restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc 
restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 
21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc 
restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc 
restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:25
crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:14:25 crc restorecon[4693]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:25 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc 
restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:14:26 crc restorecon[4693]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc 
restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:14:26 crc restorecon[4693]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 30 21:14:27 crc kubenswrapper[4914]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 21:14:27 crc kubenswrapper[4914]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 30 21:14:27 crc kubenswrapper[4914]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 21:14:27 crc kubenswrapper[4914]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 30 21:14:27 crc kubenswrapper[4914]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 30 21:14:27 crc kubenswrapper[4914]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.007354 4914 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014389 4914 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014419 4914 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014430 4914 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014439 4914 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014448 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014456 4914 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014464 4914 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014472 4914 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014480 4914 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 
21:14:27.014488 4914 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014499 4914 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014508 4914 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014517 4914 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014526 4914 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014534 4914 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014542 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014550 4914 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014558 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014565 4914 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014573 4914 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014580 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014588 4914 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014595 4914 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014603 4914 
feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014611 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014618 4914 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014626 4914 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014634 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014641 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014649 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014657 4914 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014664 4914 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014673 4914 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014681 4914 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014691 4914 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014701 4914 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014734 4914 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014742 4914 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014752 4914 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014762 4914 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014770 4914 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014778 4914 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014785 4914 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014793 4914 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014801 4914 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014808 4914 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014817 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014825 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014833 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014840 4914 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014848 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014884 4914 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014893 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014900 4914 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014908 4914 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014916 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014923 4914 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014931 4914 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014938 4914 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014946 4914 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014954 4914 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014961 4914 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014969 4914 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014979 4914 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014990 4914 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.014998 4914 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.015005 4914 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.015016 4914 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.015026 4914 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.015034 4914 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.015045 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.015936 4914 flags.go:64] FLAG: --address="0.0.0.0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.015958 4914 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.015974 4914 flags.go:64] FLAG: --anonymous-auth="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.015985 4914 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.015996 4914 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016005 4914 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016017 4914 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016028 4914 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016037 4914 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016046 4914 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016055 4914 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016064 4914 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016074 4914 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016082 4914 flags.go:64] FLAG: --cgroup-root=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016091 4914 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016101 4914 flags.go:64] FLAG: --client-ca-file=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016110 4914 flags.go:64] FLAG: --cloud-config=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016118 4914 flags.go:64] FLAG: --cloud-provider=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016127 4914 flags.go:64] FLAG: --cluster-dns="[]"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016138 4914 flags.go:64] FLAG: --cluster-domain=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016146 4914 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016155 4914 flags.go:64] FLAG: --config-dir=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016164 4914 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016173 4914 flags.go:64] FLAG: --container-log-max-files="5"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016184 4914 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016193 4914 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016202 4914 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016212 4914 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016220 4914 flags.go:64] FLAG: --contention-profiling="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016231 4914 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016240 4914 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016250 4914 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016258 4914 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016269 4914 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016279 4914 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016287 4914 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016296 4914 flags.go:64] FLAG: --enable-load-reader="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016305 4914 flags.go:64] FLAG: --enable-server="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016314 4914 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016325 4914 flags.go:64] FLAG: --event-burst="100"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016334 4914 flags.go:64] FLAG: --event-qps="50"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016343 4914 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016352 4914 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016361 4914 flags.go:64] FLAG: --eviction-hard=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016371 4914 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016380 4914 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016388 4914 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016397 4914 flags.go:64] FLAG: --eviction-soft=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016406 4914 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016415 4914 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016425 4914 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016434 4914 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016443 4914 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016452 4914 flags.go:64] FLAG: --fail-swap-on="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016460 4914 flags.go:64] FLAG: --feature-gates=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016471 4914 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016480 4914 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016489 4914 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016498 4914 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016507 4914 flags.go:64] FLAG: --healthz-port="10248"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016516 4914 flags.go:64] FLAG: --help="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016526 4914 flags.go:64] FLAG: --hostname-override=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016534 4914 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016544 4914 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016554 4914 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016562 4914 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016571 4914 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016579 4914 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016588 4914 flags.go:64] FLAG: --image-service-endpoint=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016597 4914 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016605 4914 flags.go:64] FLAG: --kube-api-burst="100"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016614 4914 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016623 4914 flags.go:64] FLAG: --kube-api-qps="50"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016632 4914 flags.go:64] FLAG: --kube-reserved=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016641 4914 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016649 4914 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016658 4914 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016667 4914 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016676 4914 flags.go:64] FLAG: --lock-file=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016685 4914 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016695 4914 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016732 4914 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016745 4914 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016754 4914 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016763 4914 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016772 4914 flags.go:64] FLAG: --logging-format="text"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016782 4914 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016792 4914 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016801 4914 flags.go:64] FLAG: --manifest-url=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016810 4914 flags.go:64] FLAG: --manifest-url-header=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016821 4914 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016831 4914 flags.go:64] FLAG: --max-open-files="1000000"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016841 4914 flags.go:64] FLAG: --max-pods="110"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016851 4914 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016861 4914 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016870 4914 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016879 4914 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016888 4914 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016897 4914 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016906 4914 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016925 4914 flags.go:64] FLAG: --node-status-max-images="50"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016934 4914 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016943 4914 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016952 4914 flags.go:64] FLAG: --pod-cidr=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016961 4914 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016975 4914 flags.go:64] FLAG: --pod-manifest-path=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016984 4914 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.016993 4914 flags.go:64] FLAG: --pods-per-core="0"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017002 4914 flags.go:64] FLAG: --port="10250"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017011 4914 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017020 4914 flags.go:64] FLAG: --provider-id=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017029 4914 flags.go:64] FLAG: --qos-reserved=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017037 4914 flags.go:64] FLAG: --read-only-port="10255"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017046 4914 flags.go:64] FLAG: --register-node="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017055 4914 flags.go:64] FLAG: --register-schedulable="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017064 4914 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017079 4914 flags.go:64] FLAG: --registry-burst="10"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017087 4914 flags.go:64] FLAG: --registry-qps="5"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017096 4914 flags.go:64] FLAG: --reserved-cpus=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017105 4914 flags.go:64] FLAG: --reserved-memory=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017116 4914 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017125 4914 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017135 4914 flags.go:64] FLAG: --rotate-certificates="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017144 4914 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017153 4914 flags.go:64] FLAG: --runonce="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017161 4914 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017179 4914 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017189 4914 flags.go:64] FLAG: --seccomp-default="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017198 4914 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017207 4914 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017216 4914 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017225 4914 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017234 4914 flags.go:64] FLAG: --storage-driver-password="root"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017243 4914 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017251 4914 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017260 4914 flags.go:64] FLAG: --storage-driver-user="root"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017270 4914 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017279 4914 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017288 4914 flags.go:64] FLAG: --system-cgroups=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017296 4914 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017309 4914 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017318 4914 flags.go:64] FLAG: --tls-cert-file=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017327 4914 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017338 4914 flags.go:64] FLAG: --tls-min-version=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017346 4914 flags.go:64] FLAG: --tls-private-key-file=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017356 4914 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017364 4914 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017373 4914 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017383 4914 flags.go:64] FLAG: --v="2"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017393 4914 flags.go:64] FLAG: --version="false"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017404 4914 flags.go:64] FLAG: --vmodule=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017414 4914 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.017424 4914 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017651 4914 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017662 4914 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017671 4914 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017679 4914 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017687 4914 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017701 4914 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017735 4914 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017743 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017751 4914 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017759 4914 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017767 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017775 4914 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017804 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017814 4914 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017823 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017832 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017840 4914 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017848 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017857 4914 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017867 4914 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017878 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017887 4914 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017896 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017904 4914 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017919 4914 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017928 4914 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017936 4914 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017944 4914 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017952 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017960 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017972 4914 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017983 4914 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.017993 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018011 4914 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018029 4914 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018039 4914 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018054 4914 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018073 4914 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018086 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018097 4914 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018108 4914 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018123 4914 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018134 4914 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018144 4914 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018154 4914 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018163 4914 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018174 4914 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018184 4914 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018193 4914 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018203 4914 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018213 4914 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018223 4914 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018232 4914 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018242 4914 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018252 4914 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018261 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018279 4914 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018291 4914 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018301 4914 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018311 4914 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018321 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018331 4914 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018344 4914 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018357 4914 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018367 4914 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018379 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018389 4914 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018399 4914 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018408 4914 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018422 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.018432 4914 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.018459 4914 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.030972 4914 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.031020 4914 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031157 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031170 4914 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031179 4914 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031190 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031198 4914 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031207 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031215 4914 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031223 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031231 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031240 4914 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031248 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031256 4914 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031264 4914 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031272 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031281 4914 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031288 4914 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031297 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031305 4914 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031313 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031320 4914 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031328 4914 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031336 4914 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031343 4914 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031351 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031360 4914 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031368 4914 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031375 4914 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031383 4914 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031394 4914 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031404 4914 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031413 4914 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031422 4914 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031432 4914 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031440 4914 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031448 4914 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031458 4914 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031468 4914 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true.
It will be removed in a future release. Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031477 4914 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031487 4914 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031497 4914 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031505 4914 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031515 4914 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031523 4914 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031531 4914 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031539 4914 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031548 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031555 4914 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031563 4914 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031571 4914 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031578 4914 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031586 4914 feature_gate.go:330] unrecognized feature 
gate: DNSNameResolver Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031594 4914 feature_gate.go:330] unrecognized feature gate: Example Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031601 4914 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031611 4914 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031621 4914 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031630 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031637 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031645 4914 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031653 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031660 4914 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031668 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031676 4914 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031683 4914 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031692 4914 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031700 4914 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 21:14:27 crc 
kubenswrapper[4914]: W0130 21:14:27.031738 4914 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031749 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031759 4914 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031770 4914 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031778 4914 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.031786 4914 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.031799 4914 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032040 4914 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032054 4914 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032065 4914 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032074 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032083 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032092 4914 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032100 4914 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032109 4914 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032117 4914 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032126 4914 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032134 4914 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032144 4914 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032153 4914 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032161 4914 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032169 4914 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032178 4914 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032185 4914 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032193 4914 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032201 4914 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032209 4914 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032216 4914 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032224 4914 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032231 4914 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032241 4914 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032253 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032263 4914 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032273 4914 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032283 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032292 4914 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032301 4914 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032310 4914 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032319 4914 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032329 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032339 4914 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032346 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032355 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032363 4914 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032371 4914 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 
21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032378 4914 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032386 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032394 4914 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032401 4914 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032409 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032416 4914 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032424 4914 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032432 4914 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032439 4914 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032447 4914 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032454 4914 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032462 4914 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032469 4914 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032477 4914 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032485 4914 feature_gate.go:330] 
unrecognized feature gate: Example Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032493 4914 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032500 4914 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032507 4914 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032516 4914 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032524 4914 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032532 4914 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032539 4914 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032546 4914 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032554 4914 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032562 4914 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032570 4914 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032579 4914 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032587 4914 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032595 4914 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 21:14:27 crc 
kubenswrapper[4914]: W0130 21:14:27.032602 4914 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032611 4914 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032619 4914 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.032627 4914 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.032639 4914 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.033974 4914 server.go:940] "Client rotation is on, will bootstrap in background" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.039568 4914 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.039687 4914 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.041359 4914 server.go:997] "Starting client certificate rotation" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.041410 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.042506 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-27 06:30:19.378311538 +0000 UTC Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.042598 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.069633 4914 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.072353 4914 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.074456 4914 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.637340 4914 log.go:25] "Validated CRI v1 runtime API" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.675247 4914 log.go:25] "Validated CRI v1 image API" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.677794 4914 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.687538 4914 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-30-21-09-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.687592 4914 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.713764 4914 manager.go:217] Machine: {Timestamp:2026-01-30 21:14:27.710001427 +0000 UTC m=+1.148638268 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:04fc677e-7e41-47a1-8a02-3259b15b63c4 BootID:f33c804c-e82d-481d-b93f-218591a98a10 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 
Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:57:ab:b2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:57:ab:b2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:2e:9a:08 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6e:dd:a8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d2:e6:56 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fc:f7:c0 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:2e:af:d6:7d:6f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e6:8f:33:b8:51:b4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.714165 4914 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.714409 4914 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.714936 4914 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.715225 4914 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.715274 4914 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Q
uantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.716186 4914 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.716216 4914 container_manager_linux.go:303] "Creating device plugin manager" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.716811 4914 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.716852 4914 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.717075 4914 state_mem.go:36] "Initialized new in-memory state store" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.717207 4914 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.722933 4914 kubelet.go:418] "Attempting to sync node with API server" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.723159 4914 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.723194 4914 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.723215 4914 kubelet.go:324] "Adding apiserver pod source" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.723233 4914 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.728572 4914 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.729570 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.729680 4914 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.729571 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.731803 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.729681 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.735263 4914 kubelet.go:854] "Not starting ClusterTrustBundle 
informer because we are in static kubelet mode" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737166 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737241 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737260 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737278 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737305 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737321 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737338 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737364 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737384 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737402 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737424 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.737441 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.738814 4914 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.739878 4914 server.go:1280] "Started 
kubelet" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.741115 4914 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.741129 4914 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.741161 4914 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.742090 4914 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 21:14:27 crc systemd[1]: Started Kubernetes Kubelet. Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.744243 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.744287 4914 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.744430 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:57:39.336588322 +0000 UTC Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.744571 4914 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.744590 4914 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.744589 4914 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.744638 4914 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 30 21:14:27 crc 
kubenswrapper[4914]: W0130 21:14:27.745274 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.745375 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747151 4914 factory.go:55] Registering systemd factory Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747186 4914 factory.go:221] Registration of the systemd container factory successfully Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747568 4914 factory.go:153] Registering CRI-O factory Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747601 4914 factory.go:221] Registration of the crio container factory successfully Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747699 4914 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747778 4914 factory.go:103] Registering Raw factory Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.747805 4914 manager.go:1196] Started watching for new ooms in manager Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.748376 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.74:6443: connect: connection refused" interval="200ms" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.748773 4914 server.go:460] "Adding debug handlers to kubelet server" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.749153 4914 manager.go:319] Starting recovery of all containers Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.749329 4914 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f9eb26c64efc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:14:27.739815878 +0000 UTC m=+1.178452699,LastTimestamp:2026-01-30 21:14:27.739815878 +0000 UTC m=+1.178452699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761569 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761690 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761763 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761802 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761828 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761870 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761898 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761926 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761969 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.761998 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762075 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762112 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762139 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762185 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762206 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762233 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762252 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762278 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762300 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762320 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762345 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762365 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762431 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762457 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762479 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762503 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762533 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" 
seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762555 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762582 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762602 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762623 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762648 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762667 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762691 4914 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762745 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762772 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762799 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762821 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762847 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762868 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762888 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762923 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762944 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762969 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.762990 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763010 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763037 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763058 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763083 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763103 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763123 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763148 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763176 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763205 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.763234 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764447 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764568 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764639 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" 
seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764774 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764834 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764865 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764917 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764942 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.764985 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: 
I0130 21:14:27.765014 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.765035 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.765062 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770015 4914 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770110 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770147 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" 
seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770175 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770204 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770261 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770288 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770314 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770341 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: 
I0130 21:14:27.770367 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770425 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770452 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770482 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770508 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770532 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770559 4914 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770587 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770614 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770641 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770668 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770694 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770753 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770784 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770810 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770836 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770863 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770889 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770920 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770947 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.770974 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771000 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771028 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771056 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771085 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771111 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771139 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771166 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771193 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771239 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771320 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771353 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771387 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771423 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771455 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771486 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771518 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771548 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771578 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771607 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771639 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771670 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771702 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 
21:14:27.771769 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771798 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771829 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771856 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771882 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.771989 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772025 4914 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772057 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772086 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772118 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772146 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772172 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772203 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772233 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772260 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772288 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772876 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772920 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772951 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" 
seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.772979 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773005 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773033 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773112 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773139 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773166 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 
21:14:27.773193 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773218 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773248 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773274 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773303 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773330 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773355 4914 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773381 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773409 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773436 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773463 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773489 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773514 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773548 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773575 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773602 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773628 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773656 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773683 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773746 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773780 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773814 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773842 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773871 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773898 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773927 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773955 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.773984 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774012 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774056 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774084 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774112 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774141 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774169 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774195 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774223 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774252 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774279 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774307 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774334 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774361 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774425 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774452 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774479 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774508 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774542 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774572 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774603 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774635 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774663 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774695 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774797 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774828 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774856 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774886 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774914 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774942 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.774970 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775002 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775032 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775059 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775090 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775117 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775145 4914 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775171 4914 reconstruct.go:97] "Volume reconstruction finished"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.775188 4914 reconciler.go:26] "Reconciler: start to sync state"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.783636 4914 manager.go:324] Recovery completed
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.801392 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.803181 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.803225 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.803242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.805032 4914 cpu_manager.go:225] "Starting CPU manager" policy="none"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.805066 4914 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.805099 4914 state_mem.go:36] "Initialized new in-memory state store"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.813544 4914 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.816594 4914 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.816675 4914 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.816749 4914 kubelet.go:2335] "Starting kubelet main sync loop"
Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.816839 4914 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 30 21:14:27 crc kubenswrapper[4914]: W0130 21:14:27.817316 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused
Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.817390 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.825853 4914 policy_none.go:49] "None policy: Start"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.830635 4914 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.830676 4914 state_mem.go:35] "Initializing new in-memory state store"
Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.844786 4914 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.896683 4914 manager.go:334] "Starting Device Plugin manager"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.896887 4914 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.896953 4914 server.go:79] "Starting device plugin registration server"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.897946 4914 eviction_manager.go:189] "Eviction manager: starting control loop"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.897981 4914 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.898841 4914 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.899161 4914 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.899228 4914 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.910231 4914 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.917870 4914 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.917993 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.919932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.919976 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.919993 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.920186 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.920473 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.920539 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.921663 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.921738 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.921756 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.921934 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.922097 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.922173 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.922232 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.922275 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.922338 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924239 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924266 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924282 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924428 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924624 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.924683 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925463 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925527 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925701 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925753 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925864 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925901 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.925760 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.926334 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927054 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927100 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927113 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927143 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927160 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927123 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927482 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.927535 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.929126 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.929168 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.929185 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:27 crc kubenswrapper[4914]: E0130 21:14:27.949768 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="400ms"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.976975 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977036 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977071 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977103 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977135 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977174 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977268 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977356 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977410 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977456 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977499 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977546 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977598 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977642 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.977726 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:27 crc kubenswrapper[4914]: I0130 21:14:27.998436 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:27.999996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.000053 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.000071 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.000115 4914 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.000819 4914 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.078918 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.079797 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.079809 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.079919 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.079967 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.079980 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080087 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080048 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080058 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080002 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080189 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080256 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080297 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080342 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080383 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080425 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080433 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080470 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080433 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080379 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080468 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080528 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080544 4914
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080558 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080579 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080692 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080383 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080804 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080618 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.080612 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.201614 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.203195 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.203246 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.203262 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.203298 4914 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.203964 4914 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.74:6443: connect: connection refused" node="crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.263817 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.288387 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.299818 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.323569 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.323687 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cbf9a5b05ec1caddec11ba63bdc8ddad596a33e0a0727132aeadaf0ad57d34ff WatchSource:0}: Error finding container cbf9a5b05ec1caddec11ba63bdc8ddad596a33e0a0727132aeadaf0ad57d34ff: Status 404 returned error can't find the container with id cbf9a5b05ec1caddec11ba63bdc8ddad596a33e0a0727132aeadaf0ad57d34ff Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.328635 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6b07c74766e321b43eb54b46203ede30272baf2bb62b12667b4d8ae43ba7e81e WatchSource:0}: Error finding container 6b07c74766e321b43eb54b46203ede30272baf2bb62b12667b4d8ae43ba7e81e: Status 404 returned error can't find the container with id 6b07c74766e321b43eb54b46203ede30272baf2bb62b12667b4d8ae43ba7e81e Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.331214 4914 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.334176 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a2c101c6b7177b91c330cb70b13a3db3294026b26d8778d9d1f6288902592d91 WatchSource:0}: Error finding container a2c101c6b7177b91c330cb70b13a3db3294026b26d8778d9d1f6288902592d91: Status 404 returned error can't find the container with id a2c101c6b7177b91c330cb70b13a3db3294026b26d8778d9d1f6288902592d91 Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.345983 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-69d42fb73fa8893ca398713dcb93c608cd5ad0e1ae157095bfff91196a16146f WatchSource:0}: Error finding container 69d42fb73fa8893ca398713dcb93c608cd5ad0e1ae157095bfff91196a16146f: Status 404 returned error can't find the container with id 69d42fb73fa8893ca398713dcb93c608cd5ad0e1ae157095bfff91196a16146f Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.351666 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="800ms" Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.352438 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-92ac4b1a4206d7a6413f9d1e3efeaa3f2616220a158e7c65c5721bac9b6ac8dc WatchSource:0}: Error finding container 92ac4b1a4206d7a6413f9d1e3efeaa3f2616220a158e7c65c5721bac9b6ac8dc: Status 404 returned error can't find the container with id 
92ac4b1a4206d7a6413f9d1e3efeaa3f2616220a158e7c65c5721bac9b6ac8dc Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.601461 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.601550 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.604793 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.606128 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.606158 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.606170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.606194 4914 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.606504 4914 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.671431 4914 event.go:368] "Unable to write event (may retry after 
sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f9eb26c64efc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:14:27.739815878 +0000 UTC m=+1.178452699,LastTimestamp:2026-01-30 21:14:27.739815878 +0000 UTC m=+1.178452699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.742367 4914 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.745371 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 15:06:51.223335741 +0000 UTC Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.822010 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"69d42fb73fa8893ca398713dcb93c608cd5ad0e1ae157095bfff91196a16146f"} Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.823311 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a2c101c6b7177b91c330cb70b13a3db3294026b26d8778d9d1f6288902592d91"} Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 
21:14:28.824749 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b07c74766e321b43eb54b46203ede30272baf2bb62b12667b4d8ae43ba7e81e"} Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.825816 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cbf9a5b05ec1caddec11ba63bdc8ddad596a33e0a0727132aeadaf0ad57d34ff"} Jan 30 21:14:28 crc kubenswrapper[4914]: I0130 21:14:28.827459 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"92ac4b1a4206d7a6413f9d1e3efeaa3f2616220a158e7c65c5721bac9b6ac8dc"} Jan 30 21:14:28 crc kubenswrapper[4914]: W0130 21:14:28.877284 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:28 crc kubenswrapper[4914]: E0130 21:14:28.877560 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.078537 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 21:14:29 crc kubenswrapper[4914]: E0130 21:14:29.082579 4914 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed 
certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:29 crc kubenswrapper[4914]: W0130 21:14:29.098272 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:29 crc kubenswrapper[4914]: E0130 21:14:29.098381 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:29 crc kubenswrapper[4914]: E0130 21:14:29.152740 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="1.6s" Jan 30 21:14:29 crc kubenswrapper[4914]: W0130 21:14:29.310280 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:29 crc kubenswrapper[4914]: E0130 21:14:29.310413 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: 
connect: connection refused" logger="UnhandledError" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.407292 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.409183 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.409248 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.409267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.409306 4914 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:14:29 crc kubenswrapper[4914]: E0130 21:14:29.409994 4914 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.743006 4914 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.745522 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 10:47:31.208972099 +0000 UTC Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.833628 4914 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3" exitCode=0 Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.833825 
4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.833872 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.835509 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.835550 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.835568 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.836931 4914 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62" exitCode=0 Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.837026 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.837063 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.838337 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.838380 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.838397 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.842857 4914 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735" exitCode=0 Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.842964 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.843036 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.844263 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.844307 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.844321 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.847427 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.847456 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.847475 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.850279 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b" exitCode=0 Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.850347 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b"} Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.850555 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.851774 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.851820 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.851838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.860425 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.862351 4914 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.862418 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:29 crc kubenswrapper[4914]: I0130 21:14:29.862442 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:30 crc kubenswrapper[4914]: W0130 21:14:30.473338 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:30 crc kubenswrapper[4914]: E0130 21:14:30.473421 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.742666 4914 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.74:6443: connect: connection refused Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.745829 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 05:47:24.390111661 +0000 UTC Jan 30 21:14:30 crc kubenswrapper[4914]: E0130 21:14:30.753814 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" 
interval="3.2s" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.855778 4914 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468" exitCode=0 Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.855853 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.856750 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.858218 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.858250 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.858263 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.858725 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"88f74a072a31d862caee808486ce40398646a7edd7a44143f51258af9e3619be"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.858801 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.859670 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.859692 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.859722 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.864689 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.864731 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.864744 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.864823 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.865607 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.865641 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.865656 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.871390 
4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.871617 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.873393 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.873569 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.873586 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.875884 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.875927 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.875947 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.875963 4914 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f"} Jan 30 21:14:30 crc kubenswrapper[4914]: I0130 21:14:30.921122 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.010420 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.011399 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.011441 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.011453 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.011481 4914 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:14:31 crc kubenswrapper[4914]: E0130 21:14:31.011964 4914 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.74:6443: connect: connection refused" node="crc" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.033567 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:31 crc kubenswrapper[4914]: W0130 21:14:31.250198 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.74:6443: 
connect: connection refused Jan 30 21:14:31 crc kubenswrapper[4914]: E0130 21:14:31.250277 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.74:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.745924 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 03:52:46.290159095 +0000 UTC Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.883958 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b"} Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.884082 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.885781 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.885831 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.885853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.887471 4914 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed" exitCode=0 Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.887511 4914 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed"} Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.887581 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.887611 4914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.887627 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.887647 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.888104 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.888952 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.888982 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.888999 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.889485 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.889693 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.889921 4914 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.889588 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.890134 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.890155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.889629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.890202 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:31 crc kubenswrapper[4914]: I0130 21:14:31.890216 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.746368 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:57:40.510987911 +0000 UTC Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.785603 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895043 4914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895133 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895194 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895274 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711"} Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895768 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2"} Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895798 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3"} Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.895829 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896670 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896685 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896641 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896795 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896812 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896977 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:32 crc kubenswrapper[4914]: I0130 21:14:32.896996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.373341 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.667598 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.746538 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 03:07:47.532321486 +0000 UTC Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.904983 4914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.905061 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.905911 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958"} Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.906065 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:33 crc 
kubenswrapper[4914]: I0130 21:14:33.906099 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28"} Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.906857 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.906921 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.906942 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.907825 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.907878 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.907903 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.921797 4914 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:14:33 crc kubenswrapper[4914]: I0130 21:14:33.921881 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" 
output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.212883 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.214357 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.214445 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.214471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.214518 4914 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.747533 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 20:00:42.773861129 +0000 UTC Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.907883 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.909116 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.909159 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:34 crc kubenswrapper[4914]: I0130 21:14:34.909176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.616299 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.616488 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.620359 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.620445 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.620460 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.624027 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.748162 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 07:29:47.033719722 +0000 UTC Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.910874 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.912213 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.912276 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:35 crc kubenswrapper[4914]: I0130 21:14:35.912296 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.261616 4914 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.261899 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.263328 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.263370 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.263387 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.299113 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.525254 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.525522 4914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.525655 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.527374 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.527416 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.527432 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:36 crc 
kubenswrapper[4914]: I0130 21:14:36.662790 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.749028 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 11:43:15.867573983 +0000 UTC Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.914201 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.914209 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.915821 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.915877 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.915894 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.916659 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.916743 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:36 crc kubenswrapper[4914]: I0130 21:14:36.916766 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:37 crc kubenswrapper[4914]: I0130 21:14:37.749973 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 
04:21:23.279676583 +0000 UTC Jan 30 21:14:37 crc kubenswrapper[4914]: E0130 21:14:37.910396 4914 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 21:14:38 crc kubenswrapper[4914]: I0130 21:14:38.145552 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 30 21:14:38 crc kubenswrapper[4914]: I0130 21:14:38.145845 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:38 crc kubenswrapper[4914]: I0130 21:14:38.147573 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:38 crc kubenswrapper[4914]: I0130 21:14:38.147629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:38 crc kubenswrapper[4914]: I0130 21:14:38.147653 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:38 crc kubenswrapper[4914]: I0130 21:14:38.750255 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:20:58.703064532 +0000 UTC Jan 30 21:14:39 crc kubenswrapper[4914]: I0130 21:14:39.751112 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:20:40.5848167 +0000 UTC Jan 30 21:14:40 crc kubenswrapper[4914]: I0130 21:14:40.751750 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 13:53:18.07330227 +0000 UTC Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.043335 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.043493 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.045001 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.045040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.045058 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:41 crc kubenswrapper[4914]: W0130 21:14:41.424861 4914 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.424981 4914 trace.go:236] Trace[592975109]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:14:31.424) (total time: 10000ms): Jan 30 21:14:41 crc kubenswrapper[4914]: Trace[592975109]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (21:14:41.424) Jan 30 21:14:41 crc kubenswrapper[4914]: Trace[592975109]: [10.000922313s] [10.000922313s] END Jan 30 21:14:41 crc kubenswrapper[4914]: E0130 21:14:41.425018 4914 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 
21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.743637 4914 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 30 21:14:41 crc kubenswrapper[4914]: I0130 21:14:41.753039 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 22:12:26.516139057 +0000 UTC Jan 30 21:14:42 crc kubenswrapper[4914]: I0130 21:14:42.318335 4914 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 30 21:14:42 crc kubenswrapper[4914]: I0130 21:14:42.318417 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 30 21:14:42 crc kubenswrapper[4914]: I0130 21:14:42.344025 4914 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]log ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]etcd ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 30 21:14:42 crc kubenswrapper[4914]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/generic-apiserver-start-informers ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/priority-and-fairness-filter ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-apiextensions-informers ok Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/crd-informer-synced failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-system-namespaces-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: 
reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/bootstrap-controller failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/start-kube-aggregator-informers ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 30 21:14:42 crc kubenswrapper[4914]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]autoregister-completion ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/apiservice-openapi-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 30 21:14:42 crc kubenswrapper[4914]: livez check failed Jan 30 21:14:42 crc kubenswrapper[4914]: I0130 21:14:42.344105 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:14:42 crc kubenswrapper[4914]: I0130 21:14:42.753824 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 01:17:44.054764651 +0000 UTC Jan 30 21:14:43 crc kubenswrapper[4914]: I0130 21:14:43.754178 4914 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:22:43.312500402 +0000 UTC Jan 30 21:14:43 crc kubenswrapper[4914]: I0130 21:14:43.921362 4914 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:14:43 crc kubenswrapper[4914]: I0130 21:14:43.921457 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 21:14:44 crc kubenswrapper[4914]: I0130 21:14:44.755204 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 05:20:26.57055248 +0000 UTC Jan 30 21:14:45 crc kubenswrapper[4914]: I0130 21:14:45.446926 4914 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 21:14:45 crc kubenswrapper[4914]: I0130 21:14:45.756366 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 22:19:29.355629754 +0000 UTC Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.531572 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.531820 4914 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.533338 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.533415 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.533437 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.538896 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.757428 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:43:17.706796674 +0000 UTC Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.944977 4914 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.946291 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.946360 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:46 crc kubenswrapper[4914]: I0130 21:14:46.946384 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.317823 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 30 
21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.320654 4914 trace.go:236] Trace[75781092]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:14:35.365) (total time: 11954ms): Jan 30 21:14:47 crc kubenswrapper[4914]: Trace[75781092]: ---"Objects listed" error: 11954ms (21:14:47.320) Jan 30 21:14:47 crc kubenswrapper[4914]: Trace[75781092]: [11.954672104s] [11.954672104s] END Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.320752 4914 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.320896 4914 trace.go:236] Trace[314911435]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:14:34.267) (total time: 13053ms): Jan 30 21:14:47 crc kubenswrapper[4914]: Trace[314911435]: ---"Objects listed" error: 13053ms (21:14:47.320) Jan 30 21:14:47 crc kubenswrapper[4914]: Trace[314911435]: [13.053537632s] [13.053537632s] END Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.320946 4914 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.323144 4914 trace.go:236] Trace[968966036]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:14:32.366) (total time: 14956ms): Jan 30 21:14:47 crc kubenswrapper[4914]: Trace[968966036]: ---"Objects listed" error: 14955ms (21:14:47.322) Jan 30 21:14:47 crc kubenswrapper[4914]: Trace[968966036]: [14.956162334s] [14.956162334s] END Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.323187 4914 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.323365 4914 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.336046 4914 kubelet_node_status.go:115] 
"Node was previously registered" node="crc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.336379 4914 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.339022 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.339081 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.339100 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.339138 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.339164 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.366077 4914 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.377234 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568b
b49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\
\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1
d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:lates
t\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f
-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.382862 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.382933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.382952 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.382992 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.383013 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.397108 4914 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58916->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.397178 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58916->192.168.126.11:17697: read: connection reset by peer" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.397556 4914 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.397601 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.397937 4914 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 30 
21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.398002 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.401229 4914 csr.go:261] certificate signing request csr-p8d99 is approved, waiting to be issued Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.401665 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237
ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\
\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a839
4b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":
487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.405168 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.405225 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.405241 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.405268 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.405281 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.429029 4914 csr.go:257] certificate signing request csr-p8d99 is issued Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.429647 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02
-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.434555 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.434596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.434607 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.434647 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.434662 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.457994 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02
-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.462999 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.463048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.463058 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.463084 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.463097 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.476057 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02
-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.476164 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.478242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.478274 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.478283 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.478305 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.478321 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.580753 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.580784 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.580794 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.580812 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.580821 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.683563 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.683617 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.683631 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.683655 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.683665 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.734928 4914 apiserver.go:52] "Watching apiserver" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.738144 4914 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.738532 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pm2tg","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-zxtk5"] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.739030 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.739195 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.739305 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.739366 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.739459 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.739529 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.739603 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.739962 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.740063 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.740078 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.740103 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.743107 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.743255 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.743379 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.743495 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.743598 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.743877 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.746070 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.746163 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.746256 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747134 4914 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 
21:14:47.747371 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747552 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747642 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747701 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747791 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747908 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.747953 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.748324 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.757875 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 14:50:59.704924411 +0000 UTC Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.762865 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.775553 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.785513 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.785672 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.785791 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.785924 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.786025 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.786761 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.794393 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.806847 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.815982 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.823444 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826557 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826625 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826664 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826698 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826743 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826768 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826792 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826814 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826844 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826866 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826891 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826957 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.826983 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827010 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827035 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827061 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827086 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827113 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827138 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827162 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827187 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827210 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827232 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827254 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827360 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827385 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827410 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827432 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827456 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827479 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827505 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827531 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827552 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827578 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827604 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827627 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 21:14:47 crc kubenswrapper[4914]: 
I0130 21:14:47.827651 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827674 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827697 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827740 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827772 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827797 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827821 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827845 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827867 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827891 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827915 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827938 
4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827964 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827988 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828013 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828035 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828060 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828083 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828105 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828126 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828151 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828173 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 
21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828198 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828221 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828244 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828265 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828287 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828314 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828338 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828360 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828381 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828404 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828433 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828456 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828479 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828501 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828523 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828546 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828568 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828591 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828615 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828637 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828659 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828683 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 
21:14:47.828723 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828749 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828797 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828822 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828845 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827081 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" 
(OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827194 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827238 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827233 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828949 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828871 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829016 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829062 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829104 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829142 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829179 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829217 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829254 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829291 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829327 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829363 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 
21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829397 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829433 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829468 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829504 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829539 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829575 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829612 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829652 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829687 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829751 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829810 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829845 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829882 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829918 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829950 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829982 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830014 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830049 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830094 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830147 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830192 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830231 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 
21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830267 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830304 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830338 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830371 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830405 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830439 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830475 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830513 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830551 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830586 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830621 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 30 
21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830655 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830688 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831396 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831569 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831604 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831643 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831685 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831743 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831776 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831814 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831851 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831894 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831931 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831969 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832003 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832040 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832077 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832115 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832150 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832185 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832221 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832257 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832296 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832332 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832367 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832401 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832436 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832472 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832511 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832548 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832583 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832620 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832655 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 21:14:47 crc 
kubenswrapper[4914]: I0130 21:14:47.832693 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832780 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832822 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832858 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832902 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832936 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832969 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833002 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833039 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833075 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833112 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827271 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827389 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827554 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827557 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827643 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827694 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827835 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827856 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.827986 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828122 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828163 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828295 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828429 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828525 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828573 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828625 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828670 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828862 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828854 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.828997 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829128 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829144 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829395 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829403 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829755 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.829800 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830169 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830533 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830537 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.830972 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831254 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.831335 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832009 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833501 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832117 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833523 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832555 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832676 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832661 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832840 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832877 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833074 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833114 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833238 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833540 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.832651 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833637 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.834065 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.834076 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.834247 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.834282 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.834297 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.834767 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.835097 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.835166 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.835977 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.836398 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.836465 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.836501 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.836748 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.836817 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.836928 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837027 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837176 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837230 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837377 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837458 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837620 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837720 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837748 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837835 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837906 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.837864 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.838051 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.838330 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839016 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.833154 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839179 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839200 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839248 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839291 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839336 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839371 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839406 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839443 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839482 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839516 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839548 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839582 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839616 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839650 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839681 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839763 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839797 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839829 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839863 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839896 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839929 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839968 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840047 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3be0c366-7d83-42e6-9a85-3f77ce72281f-mcd-auth-proxy-config\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " 
pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840108 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e82ab6e-8068-438b-9caa-f3d7028cbb5f-hosts-file\") pod \"node-resolver-zxtk5\" (UID: \"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\") " pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840165 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840212 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840253 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840288 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840323 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3be0c366-7d83-42e6-9a85-3f77ce72281f-rootfs\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840358 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840398 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840435 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840473 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840517 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840554 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840591 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840632 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3be0c366-7d83-42e6-9a85-3f77ce72281f-proxy-tls\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840669 
4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840768 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840812 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmsz\" (UniqueName: \"kubernetes.io/projected/3be0c366-7d83-42e6-9a85-3f77ce72281f-kube-api-access-wmmsz\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840846 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v84mz\" (UniqueName: \"kubernetes.io/projected/8e82ab6e-8068-438b-9caa-f3d7028cbb5f-kube-api-access-v84mz\") pod \"node-resolver-zxtk5\" (UID: \"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\") " pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840886 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840986 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841010 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841031 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841051 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841070 4914 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841092 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841114 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on 
node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841138 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841158 4914 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841179 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841197 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841215 4914 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839242 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839255 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839409 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.839656 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840100 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840776 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840887 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.840983 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841234 4914 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.841312 4914 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.841919 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.842008 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.842069 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.842145 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.842381 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.842465 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.842656 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.843030 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.843511 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.843810 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844074 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844180 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844207 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844288 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844601 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844828 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.844929 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.845045 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.845788 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.845930 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.845962 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.846048 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.846114 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:48.346086645 +0000 UTC m=+21.784723416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.846126 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.846163 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.846812 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:14:48.346661659 +0000 UTC m=+21.785298420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.846902 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847075 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.846980 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847088 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847280 4914 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847356 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847399 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847543 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847490 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.847836 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.848098 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.848439 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.848606 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.848645 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.848655 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849189 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849237 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849483 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849663 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849746 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.849861 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.850032 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.850145 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:48.350124749 +0000 UTC m=+21.788761550 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850261 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850374 4914 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850442 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850683 4914 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850759 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850843 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.850976 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851343 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 
21:14:47.851369 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851410 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851425 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851439 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851452 4914 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851493 4914 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851508 4914 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851522 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851537 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851581 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851596 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851610 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851623 4914 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851659 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851673 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851686 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851700 4914 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851747 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851760 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851774 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851814 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.851830 4914 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc 
kubenswrapper[4914]: I0130 21:14:47.852025 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.852538 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.853819 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.855397 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.856472 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.856589 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.856916 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.857196 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.857328 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.857885 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.858464 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.859980 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.860231 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc 
kubenswrapper[4914]: E0130 21:14:47.865285 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.865324 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.865338 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.865409 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:48.365376953 +0000 UTC m=+21.804013714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866800 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866851 4914 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866863 4914 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866876 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866886 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866896 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866907 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866918 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866928 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866939 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866951 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866963 4914 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866973 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866985 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.866994 4914 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867005 4914 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867015 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867025 4914 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867036 4914 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867046 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867058 4914 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867068 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867078 4914 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867087 4914 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867100 4914 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867110 4914 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867121 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867133 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867142 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867152 4914 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867163 4914 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867171 4914 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867181 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867190 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc 
kubenswrapper[4914]: I0130 21:14:47.867199 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867208 4914 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867219 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867229 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867810 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.867863 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.868805 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.869028 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.872942 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.874264 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.875473 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.878046 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.878418 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.879136 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.879294 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.879387 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.879403 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.879494 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:48.379470831 +0000 UTC m=+21.818107602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.880201 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.880476 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.880689 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.881024 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.881176 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.881340 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.881906 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.882685 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.883029 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.883074 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.883433 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.883753 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884149 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884177 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884193 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884372 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884432 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884462 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884596 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.885212 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.885234 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.884655 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.885402 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.886971 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.885417 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.885613 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.885675 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.886095 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.886114 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.886018 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.886598 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.888162 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.888457 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.889201 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.889241 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.889281 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.889298 4914 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.889321 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.889338 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.892628 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.892760 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.892925 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.893055 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.893238 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.893828 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.893931 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.893967 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.895495 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.895720 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.896128 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wvbd7"] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.896425 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.899354 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.904348 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.905505 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.908383 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.908457 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.909061 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hchqc"] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.910943 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.911589 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.911861 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.912670 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.912966 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913251 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913256 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wt7n5"] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913498 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913510 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913563 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913596 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913666 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913277 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.913854 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.914366 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-c2klk"] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.914726 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.915449 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.916151 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:47 crc kubenswrapper[4914]: E0130 21:14:47.916254 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.917260 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.918076 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.925934 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.937209 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.946907 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.952233 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.954789 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b" exitCode=255 Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.954826 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b"} Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.957485 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.965598 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.965868 4914 scope.go:117] "RemoveContainer" containerID="184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.967939 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-systemd\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.967986 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r27rl\" (UniqueName: 
\"kubernetes.io/projected/6a32fa1f-f3a9-4e60-b665-51138c3ce768-kube-api-access-r27rl\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968006 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-os-release\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968022 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-cni-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968040 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-script-lib\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968055 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4k8\" (UniqueName: \"kubernetes.io/projected/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-kube-api-access-2h4k8\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968071 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-socket-dir-parent\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968089 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-conf-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968111 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3be0c366-7d83-42e6-9a85-3f77ce72281f-proxy-tls\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968128 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cnibin\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968145 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-cni-multus\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968186 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-system-cni-dir\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968204 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1067fc5-9bff-4a81-982f-b2cca1c432d0-cni-binary-copy\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968220 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-hostroot\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968240 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968269 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-etc-kubernetes\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968302 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e82ab6e-8068-438b-9caa-f3d7028cbb5f-hosts-file\") pod \"node-resolver-zxtk5\" (UID: \"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\") " pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968320 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-kubelet\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968338 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-etc-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968412 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-node-log\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968431 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-netd\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968449 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968509 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e82ab6e-8068-438b-9caa-f3d7028cbb5f-hosts-file\") pod \"node-resolver-zxtk5\" (UID: \"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\") " pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968579 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-k8s-cni-cncf-io\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968608 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-cni-bin\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968633 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpkv\" (UniqueName: \"kubernetes.io/projected/c1067fc5-9bff-4a81-982f-b2cca1c432d0-kube-api-access-4tpkv\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968659 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968761 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-var-lib-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968786 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-daemon-config\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968812 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3be0c366-7d83-42e6-9a85-3f77ce72281f-rootfs\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968836 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-systemd-units\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968837 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968855 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-log-socket\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968882 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-config\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968911 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-netns\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968952 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfmb6\" (UniqueName: \"kubernetes.io/projected/8a911963-1d06-47d0-8f70-d81d5bd47496-kube-api-access-nfmb6\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968975 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-netns\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.968995 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-ovn-kubernetes\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969113 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-multus-certs\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969153 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-ovn\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969183 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc 
kubenswrapper[4914]: I0130 21:14:47.969235 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-system-cni-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969278 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3be0c366-7d83-42e6-9a85-3f77ce72281f-rootfs\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969313 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-os-release\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969384 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cni-binary-copy\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969494 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-cnibin\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969533 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmsz\" (UniqueName: \"kubernetes.io/projected/3be0c366-7d83-42e6-9a85-3f77ce72281f-kube-api-access-wmmsz\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969556 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v84mz\" (UniqueName: \"kubernetes.io/projected/8e82ab6e-8068-438b-9caa-f3d7028cbb5f-kube-api-access-v84mz\") pod \"node-resolver-zxtk5\" (UID: \"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\") " pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969578 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-bin\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969600 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3be0c366-7d83-42e6-9a85-3f77ce72281f-mcd-auth-proxy-config\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969596 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969621 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969641 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969662 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-kubelet\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969683 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-slash\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969724 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-env-overrides\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969763 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969817 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.969941 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovn-node-metrics-cert\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970032 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970043 4914 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970054 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970063 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970071 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970080 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970088 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970097 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970106 4914 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970115 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970124 4914 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970154 4914 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970175 4914 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970190 4914 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970202 4914 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970217 4914 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970229 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970242 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970257 4914 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970270 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970284 4914 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970296 4914 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970308 4914 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970321 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970334 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970346 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970360 4914 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970372 4914 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970411 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970424 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970436 4914 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970450 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970462 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970475 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970488 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970501 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970516 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970591 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3be0c366-7d83-42e6-9a85-3f77ce72281f-mcd-auth-proxy-config\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970778 4914 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970799 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970813 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970825 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970838 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970849 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970861 4914 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970873 4914 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970885 4914 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970898 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970911 4914 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970924 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970937 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970949 4914 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970961 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970973 4914 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970987 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.970999 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971012 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971024 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971036 4914 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971048 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971062 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971076 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971093 4914 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971110 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971124 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971135 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971147 4914 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971159 4914 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971192 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971203 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971215 4914 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971227 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971241 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971252 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971264 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971276 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971288 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971299 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971310 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971322 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971336 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971347 4914 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971361 4914 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971373 4914 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971384 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971398 4914 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971411 4914 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971423 4914 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971437 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971450 4914 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971462 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971386 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3be0c366-7d83-42e6-9a85-3f77ce72281f-proxy-tls\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971475 4914 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971490 4914 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971505 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971519 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971533 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971547 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971558 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971569 4914 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971580 4914 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971591 4914 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971603 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971616 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971628 4914 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971639 4914 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971653 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971666 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971679 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971693 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971725 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971743 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971767 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971782 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.971795 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.980623 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.988398 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmsz\" (UniqueName: \"kubernetes.io/projected/3be0c366-7d83-42e6-9a85-3f77ce72281f-kube-api-access-wmmsz\") pod \"machine-config-daemon-pm2tg\" (UID: \"3be0c366-7d83-42e6-9a85-3f77ce72281f\") " pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.990998 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v84mz\" (UniqueName: \"kubernetes.io/projected/8e82ab6e-8068-438b-9caa-f3d7028cbb5f-kube-api-access-v84mz\") pod \"node-resolver-zxtk5\" (UID: \"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\") " pod="openshift-dns/node-resolver-zxtk5"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.993040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.993076 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.993089 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.993108 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.993121 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:47Z","lastTransitionTime":"2026-01-30T21:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:14:47 crc kubenswrapper[4914]: I0130 21:14:47.994944 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.003258 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.016820 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.050679 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.061193 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.065695 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.071128 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072199 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-bin\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072235 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072258 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072280 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-kubelet\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072299 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-slash\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072320 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-env-overrides\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072341 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovn-node-metrics-cert\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072361 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-systemd\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072380 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r27rl\" (UniqueName: \"kubernetes.io/projected/6a32fa1f-f3a9-4e60-b665-51138c3ce768-kube-api-access-r27rl\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072400 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-os-release\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072420 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-cni-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072441 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-script-lib\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072459 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4k8\" (UniqueName: \"kubernetes.io/projected/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-kube-api-access-2h4k8\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072479 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-socket-dir-parent\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072497 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-conf-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072517 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cnibin\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072536 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-cni-multus\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072566 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-system-cni-dir\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072585 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1067fc5-9bff-4a81-982f-b2cca1c432d0-cni-binary-copy\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072604 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-hostroot\") pod 
\"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072622 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072663 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-etc-kubernetes\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072691 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-kubelet\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072725 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-etc-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072743 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-node-log\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 
21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072764 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-netd\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072784 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-k8s-cni-cncf-io\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072803 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-cni-bin\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072823 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpkv\" (UniqueName: \"kubernetes.io/projected/c1067fc5-9bff-4a81-982f-b2cca1c432d0-kube-api-access-4tpkv\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072842 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 
21:14:48.072866 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-var-lib-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072888 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-daemon-config\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072889 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072909 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-systemd-units\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072932 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-log-socket\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072953 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-config\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.072969 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.073008 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:48.572995077 +0000 UTC m=+22.011631838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073014 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-netns\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073056 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-kubelet\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073086 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-slash\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072972 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-netns\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073191 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfmb6\" (UniqueName: \"kubernetes.io/projected/8a911963-1d06-47d0-8f70-d81d5bd47496-kube-api-access-nfmb6\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073209 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-netns\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073225 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-ovn-kubernetes\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073241 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-multus-certs\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073257 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-ovn\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073284 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073301 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-system-cni-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073336 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-os-release\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073366 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073387 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-cnibin\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073365 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073448 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-cnibin\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073562 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-netns\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073586 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-ovn-kubernetes\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 
21:14:48.073606 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-multus-certs\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073624 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-ovn\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073325 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-hostroot\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073672 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-env-overrides\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073792 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-os-release\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073826 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-socket-dir-parent\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073918 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-conf-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073973 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cnibin\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.073982 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074025 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-var-lib-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074023 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-cni-multus\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074063 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-system-cni-dir\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074085 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-cni-binary-copy\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074124 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-log-socket\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074148 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-systemd-units\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074365 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-script-lib\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.072318 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-bin\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074611 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074729 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-config\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074767 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-systemd\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074794 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-etc-openvswitch\") pod 
\"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074816 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-openvswitch\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074825 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-daemon-config\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074843 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-etc-kubernetes\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074871 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-kubelet\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074889 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-system-cni-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 
21:14:48.074915 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-node-log\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074902 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-os-release\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074928 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-run-k8s-cni-cncf-io\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074898 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-netd\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.074983 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1067fc5-9bff-4a81-982f-b2cca1c432d0-cni-binary-copy\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.075054 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-multus-cni-dir\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.075055 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1067fc5-9bff-4a81-982f-b2cca1c432d0-host-var-lib-cni-bin\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.080051 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.080730 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovn-node-metrics-cert\") pod \"ovnkube-node-hchqc\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.087763 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.089248 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r27rl\" (UniqueName: \"kubernetes.io/projected/6a32fa1f-f3a9-4e60-b665-51138c3ce768-kube-api-access-r27rl\") pod \"ovnkube-node-hchqc\" (UID: 
\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.089408 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zxtk5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.089804 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4k8\" (UniqueName: \"kubernetes.io/projected/c4cae306-d133-4f6b-b5f7-c86a8cf6fd11-kube-api-access-2h4k8\") pod \"multus-additional-cni-plugins-wt7n5\" (UID: \"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\") " pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.094375 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.094623 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfmb6\" (UniqueName: \"kubernetes.io/projected/8a911963-1d06-47d0-8f70-d81d5bd47496-kube-api-access-nfmb6\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.097580 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpkv\" (UniqueName: \"kubernetes.io/projected/c1067fc5-9bff-4a81-982f-b2cca1c432d0-kube-api-access-4tpkv\") pod \"multus-wvbd7\" (UID: \"c1067fc5-9bff-4a81-982f-b2cca1c432d0\") " pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.098665 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.098690 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.098698 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.098731 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.098740 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.101682 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.115952 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: W0130 21:14:48.118964 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e82ab6e_8068_438b_9caa_f3d7028cbb5f.slice/crio-ed2f3f9047eb1a4c83bb054ec30901f45cc14a0b4fea001d85a15e622c0fd277 WatchSource:0}: Error finding container ed2f3f9047eb1a4c83bb054ec30901f45cc14a0b4fea001d85a15e622c0fd277: Status 404 returned error can't find the container with id ed2f3f9047eb1a4c83bb054ec30901f45cc14a0b4fea001d85a15e622c0fd277 Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.123486 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: W0130 21:14:48.133495 4914 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3be0c366_7d83_42e6_9a85_3f77ce72281f.slice/crio-65ea0621b874d02dc6dc5de4c672c45dc2a4900cd1b57e5e732631c929dc3b51 WatchSource:0}: Error finding container 65ea0621b874d02dc6dc5de4c672c45dc2a4900cd1b57e5e732631c929dc3b51: Status 404 returned error can't find the container with id 65ea0621b874d02dc6dc5de4c672c45dc2a4900cd1b57e5e732631c929dc3b51 Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.176686 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.190983 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.191958 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.198824 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.201672 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.201732 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.201749 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.201772 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.201785 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.208395 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.224389 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wvbd7" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.226589 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.241215 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.241435 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:48 crc kubenswrapper[4914]: W0130 21:14:48.242762 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1067fc5_9bff_4a81_982f_b2cca1c432d0.slice/crio-8b00adfbe51e3a6af66fb536094f47e2581ca38c74f95514469808b6cfa7f2d8 WatchSource:0}: Error finding container 8b00adfbe51e3a6af66fb536094f47e2581ca38c74f95514469808b6cfa7f2d8: Status 404 returned error can't find the container with id 8b00adfbe51e3a6af66fb536094f47e2581ca38c74f95514469808b6cfa7f2d8 Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.249192 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.258653 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.259177 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 30 21:14:48 crc kubenswrapper[4914]: W0130 21:14:48.263755 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4cae306_d133_4f6b_b5f7_c86a8cf6fd11.slice/crio-287276be54592e5fd435a4629b4d6da8ae9af71ab6a1609b0ad2fd4c29e55c7e WatchSource:0}: Error finding container 287276be54592e5fd435a4629b4d6da8ae9af71ab6a1609b0ad2fd4c29e55c7e: Status 404 returned error can't find the container with id 
287276be54592e5fd435a4629b4d6da8ae9af71ab6a1609b0ad2fd4c29e55c7e Jan 30 21:14:48 crc kubenswrapper[4914]: W0130 21:14:48.268343 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a32fa1f_f3a9_4e60_b665_51138c3ce768.slice/crio-0e42913ac7520e395fbd2e1ca6c66a93240d33e4f47675aea6fe9241c8efc3aa WatchSource:0}: Error finding container 0e42913ac7520e395fbd2e1ca6c66a93240d33e4f47675aea6fe9241c8efc3aa: Status 404 returned error can't find the container with id 0e42913ac7520e395fbd2e1ca6c66a93240d33e4f47675aea6fe9241c8efc3aa Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.293698 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.307801 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.307825 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.307835 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 
21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.307854 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.307864 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.324060 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.354934 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.370144 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.375717 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.375808 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.375832 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.375875 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.375941 4914 configmap.go:193] Couldn't get 
configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.375986 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:49.375974365 +0000 UTC m=+22.814611126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376303 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:14:49.376294903 +0000 UTC m=+22.814931664 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376362 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376384 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:49.376378215 +0000 UTC m=+22.815014966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376433 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376447 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376457 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.376477 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:49.376471747 +0000 UTC m=+22.815108498 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.389668 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.401014 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.410400 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.410426 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.410454 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.410471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.410480 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.412379 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.428432 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.430182 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-30 21:09:47 +0000 UTC, rotation deadline is 2026-11-08 10:50:51.660378491 +0000 UTC Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.430231 4914 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6757h36m3.230149385s for next certificate rotation Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.447158 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.476544 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.476836 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.476890 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.476907 4914 projected.go:194] Error 
preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.476978 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:49.476955221 +0000 UTC m=+22.915591982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.485291 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.513619 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.513658 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.513670 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.513685 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.513695 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.530880 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.566349 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.578195 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.578338 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.578402 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. 
No retries permitted until 2026-01-30 21:14:49.578384807 +0000 UTC m=+23.017021578 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.604134 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.616323 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.616359 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.616371 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.616406 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.616417 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.646972 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.685697 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.718980 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.719036 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.719052 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.719076 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.719092 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.731944 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.758608 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 13:02:48.006249375 +0000 UTC Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.775469 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.801208 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.817839 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:48 crc kubenswrapper[4914]: E0130 21:14:48.817986 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.821551 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.821612 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.821625 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.821650 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.821664 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.844820 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.882610 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.924857 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.924897 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.924909 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.924926 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.924938 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:48Z","lastTransitionTime":"2026-01-30T21:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.929963 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.959195 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerStarted","Data":"58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.959267 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerStarted","Data":"287276be54592e5fd435a4629b4d6da8ae9af71ab6a1609b0ad2fd4c29e55c7e"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.961154 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.961197 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.961214 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1d23b8d1099b9a87d94107a2d357fb011c477cba6e0f63034f2e4d0293e4dd54"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.962975 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zxtk5" event={"ID":"8e82ab6e-8068-438b-9caa-f3d7028cbb5f","Type":"ContainerStarted","Data":"d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.963005 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zxtk5" event={"ID":"8e82ab6e-8068-438b-9caa-f3d7028cbb5f","Type":"ContainerStarted","Data":"ed2f3f9047eb1a4c83bb054ec30901f45cc14a0b4fea001d85a15e622c0fd277"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.964378 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"22e8cf92419a5928bfbef43d29e996e1a2d05d37f8527740ade68967b46a817e"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.966037 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.966085 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cc95e233278ebd81d2113277843a11c6b367ebf6eef09704df50cce50fb97e18"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.968938 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.973939 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.975404 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.976071 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.977946 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18" exitCode=0 Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.978033 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" 
event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.978094 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"0e42913ac7520e395fbd2e1ca6c66a93240d33e4f47675aea6fe9241c8efc3aa"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.979177 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerStarted","Data":"ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.979199 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerStarted","Data":"8b00adfbe51e3a6af66fb536094f47e2581ca38c74f95514469808b6cfa7f2d8"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.982522 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.982550 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4"} Jan 30 21:14:48 crc kubenswrapper[4914]: I0130 21:14:48.982561 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" 
event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"65ea0621b874d02dc6dc5de4c672c45dc2a4900cd1b57e5e732631c929dc3b51"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.011045 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.026909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.026957 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.026969 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.026988 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.027001 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.029431 4914 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.067451 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.128526 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.128565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.128576 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.128592 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.128601 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.132001 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.146253 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.223769 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.231146 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.231184 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.231208 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.231227 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.231236 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.245944 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.268015 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.308273 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.333919 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.333971 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.333987 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.334008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.334021 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.349692 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.386592 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:49 crc 
kubenswrapper[4914]: E0130 21:14:49.386770 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:14:51.386752057 +0000 UTC m=+24.825388818 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.386843 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.386897 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.386925 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.386983 4914 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.387072 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:51.387044724 +0000 UTC m=+24.825681485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.387079 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.387096 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.387108 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:49 crc kubenswrapper[4914]: 
E0130 21:14:49.387102 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.387148 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:51.387141086 +0000 UTC m=+24.825777847 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.387208 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:51.387183337 +0000 UTC m=+24.825820118 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.388462 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.437408 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.437456 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.437466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.437483 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.437495 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.438844 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.477783 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.488293 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.488450 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.488468 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.488480 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.488533 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:51.488518811 +0000 UTC m=+24.927155582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.506906 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.539828 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.539898 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.539912 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.539935 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.539975 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.548918 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.588327 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.589119 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.589308 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.589401 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. 
No retries permitted until 2026-01-30 21:14:51.589378314 +0000 UTC m=+25.028015075 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.632379 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.642803 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.642860 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.642871 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.642892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.642904 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.670287 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.707929 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.746296 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c
9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.746503 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.746546 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.746556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc 
kubenswrapper[4914]: I0130 21:14:49.746573 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.746581 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.758895 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 18:53:57.779835639 +0000 UTC Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.796947 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.817776 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.817812 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.817934 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.818021 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.818161 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:49 crc kubenswrapper[4914]: E0130 21:14:49.818259 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.821870 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.822614 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.823339 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.823988 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.824565 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.825072 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.825642 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.826198 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.826835 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.826865 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.827357 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.829548 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.830236 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.831129 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.831662 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.832208 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.833080 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.833618 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.834389 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.834937 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.835519 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.836374 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.836928 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.837348 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.838304 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.838769 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.839930 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.840539 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.841359 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.841945 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.843563 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.844221 4914 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.844322 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.846374 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.847695 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.848740 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.849114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.849176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc 
kubenswrapper[4914]: I0130 21:14:49.849189 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.849211 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.849227 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.852217 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.854832 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.856057 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.864592 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.865745 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.867012 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.867901 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.869887 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.870771 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.870892 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.871967 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.872750 4914 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.874059 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.875101 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.876432 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.877139 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.877833 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.879995 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.880811 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.882030 4914 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.907206 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.950958 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.951402 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:49 crc 
kubenswrapper[4914]: I0130 21:14:49.951433 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.951443 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.951460 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.951468 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:49Z","lastTransitionTime":"2026-01-30T21:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.986104 4914 generic.go:334] "Generic (PLEG): container finished" podID="c4cae306-d133-4f6b-b5f7-c86a8cf6fd11" containerID="58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1" exitCode=0 Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.986170 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerDied","Data":"58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.988694 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.988748 4914 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} Jan 30 21:14:49 crc kubenswrapper[4914]: I0130 21:14:49.988758 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.023142 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.046312 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.059116 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.059155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.059163 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.059180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.059190 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.072545 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc 
kubenswrapper[4914]: I0130 21:14:50.115991 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.143375 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.161825 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.161861 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.161872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.161888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.161898 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.188582 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.226303 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.266758 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.266787 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.266797 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 
21:14:50.266810 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.266820 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.267293 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.314021 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c
9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.345032 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.371167 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc 
kubenswrapper[4914]: I0130 21:14:50.371207 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.371219 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.371260 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.371274 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.394614 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.430361 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.455996 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7xn26"] Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.456400 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.473502 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.473533 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.473544 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.473561 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.473573 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.474202 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.476374 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.496847 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c99cec6-435b-4912-b6e5-eb42cf23adfc-serviceca\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " 
pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.496891 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5xh\" (UniqueName: \"kubernetes.io/projected/5c99cec6-435b-4912-b6e5-eb42cf23adfc-kube-api-access-8j5xh\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.496958 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c99cec6-435b-4912-b6e5-eb42cf23adfc-host\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.497957 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.516227 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.536542 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.579375 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.579409 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.579422 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.579458 4914 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.579468 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.590850 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.597618 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c99cec6-435b-4912-b6e5-eb42cf23adfc-serviceca\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.597860 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j5xh\" (UniqueName: \"kubernetes.io/projected/5c99cec6-435b-4912-b6e5-eb42cf23adfc-kube-api-access-8j5xh\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.598017 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c99cec6-435b-4912-b6e5-eb42cf23adfc-host\") pod \"node-ca-7xn26\" (UID: 
\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.598068 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c99cec6-435b-4912-b6e5-eb42cf23adfc-host\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.599385 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c99cec6-435b-4912-b6e5-eb42cf23adfc-serviceca\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.636472 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5xh\" (UniqueName: \"kubernetes.io/projected/5c99cec6-435b-4912-b6e5-eb42cf23adfc-kube-api-access-8j5xh\") pod \"node-ca-7xn26\" (UID: \"5c99cec6-435b-4912-b6e5-eb42cf23adfc\") " pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.646275 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.682083 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 
21:14:50.682124 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.682137 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.682154 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.682166 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.691544 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.724105 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc 
kubenswrapper[4914]: I0130 21:14:50.759546 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 18:14:13.544743537 +0000 UTC Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.767926 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7xn26" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.774926 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.785289 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.785346 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.785366 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.785391 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.785413 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: W0130 21:14:50.788785 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c99cec6_435b_4912_b6e5_eb42cf23adfc.slice/crio-87854cff1ac99602aed9f2f8138c352a434ab87d1c29167a065971af1a2c8aa8 WatchSource:0}: Error finding container 87854cff1ac99602aed9f2f8138c352a434ab87d1c29167a065971af1a2c8aa8: Status 404 returned error can't find the container with id 87854cff1ac99602aed9f2f8138c352a434ab87d1c29167a065971af1a2c8aa8 Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.814600 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.1
1\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.817196 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:50 crc kubenswrapper[4914]: E0130 21:14:50.817346 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.852511 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.889494 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.889521 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.889799 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.889818 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.889827 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.892582 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.925157 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.928793 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.928933 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.943870 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.985956 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:50Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.992976 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:50 crc 
kubenswrapper[4914]: I0130 21:14:50.993030 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.993048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.993073 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.993090 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:50Z","lastTransitionTime":"2026-01-30T21:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.999822 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.999887 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} Jan 30 21:14:50 crc kubenswrapper[4914]: I0130 21:14:50.999905 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 
21:14:51.002563 4914 generic.go:334] "Generic (PLEG): container finished" podID="c4cae306-d133-4f6b-b5f7-c86a8cf6fd11" containerID="ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16" exitCode=0 Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.002646 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerDied","Data":"ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.004243 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.007293 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7xn26" event={"ID":"5c99cec6-435b-4912-b6e5-eb42cf23adfc","Type":"ContainerStarted","Data":"87854cff1ac99602aed9f2f8138c352a434ab87d1c29167a065971af1a2c8aa8"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.030336 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.067947 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc 
kubenswrapper[4914]: I0130 21:14:51.096562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.096605 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.096618 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.096637 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.096650 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.107168 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.150331 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.190070 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.200996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.201031 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.201040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.201053 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.201062 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.232928 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.268177 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.304689 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.304754 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 
21:14:51.304771 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.304793 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.304807 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.307961 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c
9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.359514 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.394463 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.405517 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.405644 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.405698 4914 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered 
Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.405698 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:14:55.405676868 +0000 UTC m=+28.844313639 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406008 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:55.405999656 +0000 UTC m=+28.844636417 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.406022 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.406045 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406121 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406132 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406141 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406162 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:55.406156489 +0000 UTC m=+28.844793250 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406202 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.406221 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:55.406215901 +0000 UTC m=+28.844852662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.407098 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.407119 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.407129 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.407144 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.407155 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.424810 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.464377 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.505382 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.506858 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.507086 4914 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.507112 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.507126 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.507177 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:55.507159576 +0000 UTC m=+28.945796347 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.509035 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.509064 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.509075 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.509090 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.509101 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.552436 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.594323 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.607785 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.607925 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.607980 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. No retries permitted until 2026-01-30 21:14:55.607964878 +0000 UTC m=+29.046601639 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.611441 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.611473 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.611481 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.611495 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.611505 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.647587 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.687364 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.714806 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.715072 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.715109 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.715128 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.715152 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.715169 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.748212 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf815
80e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.760871 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:47:32.123411797 +0000 UTC Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.791540 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.817197 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.817257 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.817292 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.817393 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.817550 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:51 crc kubenswrapper[4914]: E0130 21:14:51.817935 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.818606 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.818660 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.818681 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.818734 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.818753 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.836984 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.867432 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc 
kubenswrapper[4914]: I0130 21:14:51.911822 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.920985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.921054 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.921074 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.921107 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.921146 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:51Z","lastTransitionTime":"2026-01-30T21:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.949265 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:51 crc kubenswrapper[4914]: I0130 21:14:51.995059 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:51Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.013435 4914 generic.go:334] "Generic (PLEG): container finished" podID="c4cae306-d133-4f6b-b5f7-c86a8cf6fd11" containerID="024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d" exitCode=0 Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.013547 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerDied","Data":"024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.015830 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7xn26" event={"ID":"5c99cec6-435b-4912-b6e5-eb42cf23adfc","Type":"ContainerStarted","Data":"014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.023472 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.023513 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 
21:14:52.023529 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.023556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.023574 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.039484 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.071463 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.111554 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.126251 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.126301 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.126317 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.126339 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.126355 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.152284 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.185938 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.233434 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.233491 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.233483 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.233654 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 
crc kubenswrapper[4914]: I0130 21:14:52.233765 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.233789 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.281125 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.315661 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerI
D\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.336978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.337035 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 
21:14:52.337052 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.337075 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.337091 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.354377 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.392507 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.429328 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.444526 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.444556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.444565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.444582 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.444595 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.473145 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.516136 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 
21:14:52.547637 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc 
kubenswrapper[4914]: I0130 21:14:52.548345 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.548413 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.548437 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.548466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.548488 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.593268 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.627109 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.651144 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.651384 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.651534 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.651698 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.651882 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.668265 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.708182 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.749138 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.755008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.755051 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.755069 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.755090 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.755100 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.761431 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:36:25.309973625 +0000 UTC Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.787957 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.817261 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:52 crc kubenswrapper[4914]: E0130 21:14:52.817470 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.829336 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.858046 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.858121 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.858144 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.858165 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.858180 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.870100 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.906832 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.946044 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.960888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.960940 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.960951 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.960986 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.960997 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:52Z","lastTransitionTime":"2026-01-30T21:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:52 crc kubenswrapper[4914]: I0130 21:14:52.986989 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:
14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:52Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.022809 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.025079 4914 generic.go:334] "Generic (PLEG): container finished" podID="c4cae306-d133-4f6b-b5f7-c86a8cf6fd11" containerID="9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0" exitCode=0 Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.025450 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerDied","Data":"9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.049034 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.064059 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.064112 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.064134 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.064162 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.064184 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.070920 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.111723 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.152798 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.168563 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.168600 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.168613 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.168630 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.168643 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.189551 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.231474 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.267870 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc 
kubenswrapper[4914]: I0130 21:14:53.273231 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.273298 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.273317 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.273344 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.273362 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.309959 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.345171 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.376865 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.376903 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.376912 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.376932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.376942 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.391633 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.429954 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.465097 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.480536 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.480610 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.480628 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.480653 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.480672 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.506997 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.547855 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.583479 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.583550 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.583576 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.583607 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.583635 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.589145 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.632073 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.681617 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.685843 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.686008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.686103 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.686188 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.686265 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.722668 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.751597 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.762399 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:38:36.215642462 +0000 UTC Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.788651 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.788841 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.788927 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.789016 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.789094 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.792375 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.817778 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.817809 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.817789 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:53 crc kubenswrapper[4914]: E0130 21:14:53.817973 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:53 crc kubenswrapper[4914]: E0130 21:14:53.818104 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:53 crc kubenswrapper[4914]: E0130 21:14:53.818321 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.830473 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.875782 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.891530 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.891780 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.892071 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.892174 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.892265 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.909598 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:53Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:53 crc 
kubenswrapper[4914]: I0130 21:14:53.995232 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.995300 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.995359 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.995390 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:53 crc kubenswrapper[4914]: I0130 21:14:53.995414 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:53Z","lastTransitionTime":"2026-01-30T21:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.032265 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerStarted","Data":"c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.059101 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.077636 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.098498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.098551 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.098566 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.098590 4914 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.098606 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.102673 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.121312 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.144910 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.165428 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.189026 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.202033 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.202107 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.202126 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.202153 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.202174 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.232781 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.273007 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.304457 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.304510 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.304529 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.304553 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.304570 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.315761 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.351656 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.400868 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.407610 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.407671 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.407694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.407758 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.407781 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.445000 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.474685 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.510818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.510884 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.510901 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.510926 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.510945 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.518019 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4
k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.550510 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:54Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:54 crc 
kubenswrapper[4914]: I0130 21:14:54.614125 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.614476 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.614623 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.614812 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.614969 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.717520 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.717819 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.717845 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.717878 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.717900 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.763639 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 13:31:23.906582793 +0000 UTC Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.817450 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:54 crc kubenswrapper[4914]: E0130 21:14:54.817863 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.820516 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.820588 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.820613 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.820643 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.820665 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.923961 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.924018 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.924036 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.924061 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:54 crc kubenswrapper[4914]: I0130 21:14:54.924080 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:54Z","lastTransitionTime":"2026-01-30T21:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.028328 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.028384 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.028401 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.028425 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.028441 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.044998 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.045374 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.052142 4914 generic.go:334] "Generic (PLEG): container finished" podID="c4cae306-d133-4f6b-b5f7-c86a8cf6fd11" containerID="c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459" exitCode=0 Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.052219 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerDied","Data":"c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.069244 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.088887 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.094776 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.108811 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 
21:14:55.126364 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.134485 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.134535 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.134558 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.134588 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.134609 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.162077 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.193032 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.217897 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.233368 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.237039 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.237079 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.237096 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.237117 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.237131 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.246209 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.261081 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.273890 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.291240 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4
k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.304037 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc 
kubenswrapper[4914]: I0130 21:14:55.322383 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.337428 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.339783 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.339850 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.339876 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.339909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.339934 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.358079 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.370040 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 
21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.383622 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.402950 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.418371 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.442838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc 
kubenswrapper[4914]: I0130 21:14:55.442916 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.442933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.442963 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.442989 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.448261 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.448414 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.448462 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.448551 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.448672 4914 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.448780 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:03.448758344 +0000 UTC m=+36.887395145 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449285 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:15:03.449266785 +0000 UTC m=+36.887903586 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449405 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449462 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:03.44944789 +0000 UTC m=+36.888084691 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449561 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449592 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449613 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.449657 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:03.449644014 +0000 UTC m=+36.888280815 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.450082 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.471681 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.486002 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.519662 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.544928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.544968 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.544977 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.544992 4914 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.545001 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.549688 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.549872 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.549904 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.549917 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.549979 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:03.549961975 +0000 UTC m=+36.988598736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.552484 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc 
kubenswrapper[4914]: I0130 21:14:55.591359 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.627057 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.647933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.647986 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.648001 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.648021 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.648036 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.650787 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.650986 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.651075 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:03.651049953 +0000 UTC m=+37.089686814 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.668673 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.707822 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.749400 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.750320 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.750390 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.750405 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.750429 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.750443 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.764370 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:30:59.272548711 +0000 UTC Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.790677 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.817393 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.817487 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.817531 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.817572 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.817701 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:55 crc kubenswrapper[4914]: E0130 21:14:55.817821 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.830489 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:55Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.853460 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.853493 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.853503 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.853517 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.853527 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.956612 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.956672 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.956690 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.956737 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:55 crc kubenswrapper[4914]: I0130 21:14:55.956755 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:55Z","lastTransitionTime":"2026-01-30T21:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.059621 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.059663 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.059675 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.059694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.059723 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.060937 4914 generic.go:334] "Generic (PLEG): container finished" podID="c4cae306-d133-4f6b-b5f7-c86a8cf6fd11" containerID="0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699" exitCode=0 Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.061191 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerDied","Data":"0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.063471 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.063534 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.077829 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c
9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.087999 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.089442 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.107063 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.120405 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.133887 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035e
f415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.147510 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.161588 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.162935 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.163206 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.163220 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.163242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.163256 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.179725 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2e
cf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.199025 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
,\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.244835 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.266965 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.267010 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.267025 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.267046 4914 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.267061 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.270075 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b4
91cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.312024 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.349855 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc 
kubenswrapper[4914]: I0130 21:14:56.369589 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.369636 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.369651 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.369669 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.369682 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.391625 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.427942 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.473286 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.474009 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.474272 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.474288 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.474337 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.474352 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.509722 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.549742 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.579207 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.579324 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.579349 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.579395 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.579422 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.600145 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.629523 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.672520 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.682524 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.682574 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.682593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.682618 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.682636 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.709930 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.765522 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 01:50:21.105948573 +0000 UTC Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.767384 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.789572 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.789632 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.789666 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.789692 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.789736 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.799695 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.818171 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:56 crc kubenswrapper[4914]: E0130 21:14:56.818371 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.836978 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.872398 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.893247 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.893286 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.893298 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.893353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.893375 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.908122 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2e
cf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.961498 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.996422 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.996480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.996503 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.996530 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:56 crc kubenswrapper[4914]: I0130 21:14:56.996561 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:56Z","lastTransitionTime":"2026-01-30T21:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.001946 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.030060 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.041727 4914 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.080141 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" event={"ID":"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11","Type":"ContainerStarted","Data":"74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.085476 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.098944 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.098993 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.099006 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.099025 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.099039 4914 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.108584 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc 
kubenswrapper[4914]: I0130 21:14:57.169136 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.189859 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.202186 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.202236 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.202254 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.202278 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.202296 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.235586 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf815
80e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.274148 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.305780 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.306157 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.306267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.306375 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.306512 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.310175 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.350397 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.398117 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.410173 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.410220 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.410231 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.410251 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.410266 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.435456 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.465424 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc 
kubenswrapper[4914]: I0130 21:14:57.508729 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.508791 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.508801 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.508822 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.508838 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.516417 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.530443 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.535394 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.535466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.535494 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.535522 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.535554 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.552891 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.553923 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.558850 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.558893 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.558903 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.558922 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.558937 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.589263 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.594145 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.594181 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.594190 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.594205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.594217 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.599092 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.614903 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.623138 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.623288 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.623372 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.623464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.623542 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.631811 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.655269 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.655491 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.657119 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.657170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.657185 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.657206 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.657222 4914 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.669288 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.704746 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.742948 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.760009 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.760069 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.760081 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.760096 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.760106 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.765882 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:06:15.644392962 +0000 UTC Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.817468 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.817519 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.817611 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.817659 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.817819 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:57 crc kubenswrapper[4914]: E0130 21:14:57.817952 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.832526 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.845420 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.862657 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.862752 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.862770 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 
21:14:57.862789 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.862799 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.864968 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.907563 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c
9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.950695 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.965311 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.965350 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.965360 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.965376 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.965431 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:57Z","lastTransitionTime":"2026-01-30T21:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:57 crc kubenswrapper[4914]: I0130 21:14:57.993553 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:57Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.029587 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.065784 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.067150 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.067188 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.067201 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.067218 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.067231 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.108687 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.153738 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.169228 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.169279 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.169297 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.169322 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.169341 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.185700 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2e
cf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.241158 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.272305 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2f
e7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024f
f0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.273589 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.273618 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.273629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.273645 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.273656 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.304755 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc 
kubenswrapper[4914]: I0130 21:14:58.351356 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.376437 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.376466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.376474 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.376487 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.376497 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.385280 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.479072 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.479108 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.479119 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.479136 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.479149 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.581403 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.581434 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.581442 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.581455 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.581464 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.684521 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.684556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.684565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.684578 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.684586 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.766461 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 17:40:58.824891587 +0000 UTC Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.787194 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.787241 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.787253 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.787286 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.787299 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.817612 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:14:58 crc kubenswrapper[4914]: E0130 21:14:58.817805 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.889881 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.889925 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.889938 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.889954 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.889966 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.992849 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.992909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.992926 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.992947 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:58 crc kubenswrapper[4914]: I0130 21:14:58.992964 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:58Z","lastTransitionTime":"2026-01-30T21:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.090999 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/0.log" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.094102 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668" exitCode=1 Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.094159 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.094937 4914 scope.go:117] "RemoveContainer" containerID="5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.095173 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.095221 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.095240 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.095262 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.095280 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.119003 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10
dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.137427 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.153520 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.169258 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.190276 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.198859 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.198994 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.199020 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.199049 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.199068 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.213667 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:
14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.242963 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:14:58Z\\\",\\\"message\\\":\\\"Removed *v1.Namespace event handler 5\\\\nI0130 21:14:58.263504 6197 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:14:58.263540 6197 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 
21:14:58.263571 6197 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:14:58.263697 6197 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:14:58.264143 6197 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:14:58.264206 6197 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264422 6197 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264618 6197 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.265068 6197 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:14:58.265103 6197 factory.go:656] Stopping watch factory\\\\nI0130 21:14:58.265120 6197 ovnkube.go:599] Stopped ovnkube\\\\nI0130 
21:14:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6
ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.269059 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.284873 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc 
kubenswrapper[4914]: I0130 21:14:59.302980 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.303179 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.303351 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.303507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.303596 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.307909 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.321003 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.340207 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.356633 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.376321 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.392513 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.404583 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:14:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.406600 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.406748 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.406839 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.406925 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.407006 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.510928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.510966 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.510978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.510995 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.511006 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.613842 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.613884 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.613898 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.613920 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.613931 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.717348 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.717426 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.717449 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.717480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.717502 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.766952 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 08:44:23.4370022 +0000 UTC Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.817650 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:14:59 crc kubenswrapper[4914]: E0130 21:14:59.818104 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.817649 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.817650 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:14:59 crc kubenswrapper[4914]: E0130 21:14:59.818234 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:14:59 crc kubenswrapper[4914]: E0130 21:14:59.818809 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.820110 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.820149 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.820161 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.820180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.820193 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.922444 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.922478 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.922489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.922504 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:14:59 crc kubenswrapper[4914]: I0130 21:14:59.922515 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:14:59Z","lastTransitionTime":"2026-01-30T21:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.024583 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.024647 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.024660 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.024676 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.024688 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.128255 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.128327 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.128352 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.128382 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.128404 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.230990 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.231017 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.231029 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.231045 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.231059 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.333818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.333887 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.333907 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.333932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.333950 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.436319 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.436381 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.436397 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.436417 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.436429 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.539494 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.539532 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.539544 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.539559 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.539570 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.642123 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.642188 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.642213 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.642244 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.642267 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.745334 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.745380 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.745394 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.745414 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.745426 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.767904 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:27:35.077039256 +0000 UTC Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.817387 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:00 crc kubenswrapper[4914]: E0130 21:15:00.817523 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.847200 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.847241 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.847252 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.847270 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.847284 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.949599 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.949652 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.949670 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.949694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:00 crc kubenswrapper[4914]: I0130 21:15:00.949737 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:00Z","lastTransitionTime":"2026-01-30T21:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.052828 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.052888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.052906 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.052933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.052953 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.103067 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/0.log" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.106054 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.106719 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.131122 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.153522 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.155267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.155477 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.155662 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 
21:15:01.155939 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.156126 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.163356 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv"] Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.163770 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.165241 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.166655 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.183933 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.200541 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.205475 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.205532 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.205560 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flg8f\" (UniqueName: \"kubernetes.io/projected/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-kube-api-access-flg8f\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.205958 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.211405 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.232696 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:14:58Z\\\",\\\"message\\\":\\\"Removed *v1.Namespace event handler 5\\\\nI0130 21:14:58.263504 6197 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:14:58.263540 6197 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:14:58.263571 6197 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:14:58.263697 6197 reflector.go:311] 
Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:14:58.264143 6197 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:14:58.264206 6197 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264422 6197 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264618 6197 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.265068 6197 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:14:58.265103 6197 factory.go:656] Stopping watch factory\\\\nI0130 21:14:58.265120 6197 ovnkube.go:599] Stopped ovnkube\\\\nI0130 
21:14:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.259253 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.259579 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.259787 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.259951 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.260809 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.270379 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.285276 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.302008 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.306772 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.306830 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.306862 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flg8f\" (UniqueName: \"kubernetes.io/projected/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-kube-api-access-flg8f\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.306896 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.307444 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.307934 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.313195 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: 
\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.318112 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.349659 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flg8f\" (UniqueName: \"kubernetes.io/projected/c287caa9-36a4-4d1f-9799-0fda91a8c8d2-kube-api-access-flg8f\") pod \"ovnkube-control-plane-749d76644c-z2dvv\" (UID: \"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.363324 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.363635 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.363744 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.363854 4914 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.363938 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.367520 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.396817 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.419346 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27
929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:
14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.439569 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc 
kubenswrapper[4914]: I0130 21:15:01.453453 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.464378 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.465963 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.466007 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.466016 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.466031 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.466042 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.473725 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.484014 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.487835 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.511797 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.528089 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.583692 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.585561 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.585596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.585606 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.585623 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.585634 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.594744 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.606736 4914 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.615814 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.628475 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.640488 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.653208 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.664449 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.677745 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.687901 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc 
kubenswrapper[4914]: I0130 21:15:01.687962 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.687976 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.687996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.688009 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.694641 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:14:58Z\\\",\\\"message\\\":\\\"Removed *v1.Namespace event handler 5\\\\nI0130 21:14:58.263504 6197 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0130 21:14:58.263540 6197 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:14:58.263571 6197 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:14:58.263697 6197 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:14:58.264143 6197 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:14:58.264206 6197 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264422 6197 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264618 6197 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.265068 6197 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:14:58.265103 6197 factory.go:656] Stopping watch factory\\\\nI0130 21:14:58.265120 6197 ovnkube.go:599] Stopped ovnkube\\\\nI0130 
21:14:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.713236 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.724455 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc 
kubenswrapper[4914]: I0130 21:15:01.738726 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e
8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:01Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.768333 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 11:25:50.640005425 +0000 UTC Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.790480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.790545 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.790566 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.790596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.790618 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.817644 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.817741 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:01 crc kubenswrapper[4914]: E0130 21:15:01.817876 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.817903 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:01 crc kubenswrapper[4914]: E0130 21:15:01.818070 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:01 crc kubenswrapper[4914]: E0130 21:15:01.818285 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.893697 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.893764 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.893774 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.893793 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.893807 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.997092 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.997152 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.997164 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.997189 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:01 crc kubenswrapper[4914]: I0130 21:15:01.997400 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:01Z","lastTransitionTime":"2026-01-30T21:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.100786 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.100844 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.100855 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.100880 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.100894 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.113350 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/1.log" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.114428 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/0.log" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.118070 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb" exitCode=1 Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.118154 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.118219 4914 scope.go:117] "RemoveContainer" containerID="5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.124481 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" event={"ID":"c287caa9-36a4-4d1f-9799-0fda91a8c8d2","Type":"ContainerStarted","Data":"f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.124537 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" event={"ID":"c287caa9-36a4-4d1f-9799-0fda91a8c8d2","Type":"ContainerStarted","Data":"6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.124552 
4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" event={"ID":"c287caa9-36a4-4d1f-9799-0fda91a8c8d2","Type":"ContainerStarted","Data":"ce19625fa60f51f6daa2ca1c642141b23d3551dc42ccea19551a530d61ac5aa5"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.126147 4914 scope.go:117] "RemoveContainer" containerID="613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb" Jan 30 21:15:02 crc kubenswrapper[4914]: E0130 21:15:02.126545 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.139012 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.154252 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.172851 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.191992 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.204619 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.204740 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.204761 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.204787 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.204806 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.206333 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.225525 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.235862 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.247641 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.261625 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.281028 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:14:58Z\\\",\\\"message\\\":\\\"Removed *v1.Namespace event handler 5\\\\nI0130 21:14:58.263504 6197 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:14:58.263540 6197 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:14:58.263571 6197 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:14:58.263697 6197 reflector.go:311] 
Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:14:58.264143 6197 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:14:58.264206 6197 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264422 6197 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264618 6197 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.265068 6197 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:14:58.265103 6197 factory.go:656] Stopping watch factory\\\\nI0130 21:14:58.265120 6197 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21:14:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 
after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.301817 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.307008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.307075 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.307100 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.307135 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.307160 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.317648 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb
96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.335889 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.349342 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc 
kubenswrapper[4914]: I0130 21:15:02.362577 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.372956 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.383488 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.398171 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.410625 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.410673 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.410685 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.410717 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.410729 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.412676 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.424591 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.436375 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.449596 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.479888 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa8aac6857513bdc6df3843e6418ce85ffafbbc21aa1ed90ca9cf5e7937e668\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:14:58Z\\\",\\\"message\\\":\\\"Removed *v1.Namespace event handler 5\\\\nI0130 21:14:58.263504 6197 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:14:58.263540 6197 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:14:58.263571 6197 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:14:58.263697 6197 reflector.go:311] 
Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:14:58.264143 6197 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:14:58.264206 6197 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264422 6197 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.264618 6197 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:14:58.265068 6197 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:14:58.265103 6197 factory.go:656] Stopping watch factory\\\\nI0130 21:14:58.265120 6197 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21:14:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 
after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.503389 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.513674 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.513746 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.513784 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.513808 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.513824 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.516907 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb
96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.532807 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.550458 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.563172 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.577818 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.588065 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc 
kubenswrapper[4914]: I0130 21:15:02.600497 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.609269 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.620189 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.620439 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.620448 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.620468 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.620478 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.626124 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.640823 4914 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.722526 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.722574 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.722586 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.722605 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.722617 4914 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.768991 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:18:27.842140896 +0000 UTC Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.817759 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:02 crc kubenswrapper[4914]: E0130 21:15:02.817932 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.824490 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.824516 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.824525 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.824537 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.824546 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.927032 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.927096 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.927114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.927141 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:02 crc kubenswrapper[4914]: I0130 21:15:02.927160 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:02Z","lastTransitionTime":"2026-01-30T21:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.030080 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.030141 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.030157 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.030182 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.030200 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.131402 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/1.log" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.133623 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.133745 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.133818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.133894 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.133921 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.137747 4914 scope.go:117] "RemoveContainer" containerID="613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.138010 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.156750 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: 
I0130 21:15:03.173522 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.193206 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.214813 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.237269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc 
kubenswrapper[4914]: I0130 21:15:03.237321 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.237333 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.237351 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.237364 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.246422 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 
21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.281447 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.302614 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:
29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.326468 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.339899 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.339953 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.339969 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.339992 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.340010 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.340208 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc 
kubenswrapper[4914]: I0130 21:15:03.354064 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.369761 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.381052 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.395840 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.412169 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.426561 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.442418 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.442483 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.442510 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.442543 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.442571 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.443334 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.457738 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.529569 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.529647 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.529677 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.529765 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.529815 4914 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.529826 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.529857 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:15:19.529814885 +0000 UTC m=+52.968451686 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.529909 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:19.529890026 +0000 UTC m=+52.968526937 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.529938 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:19.529922637 +0000 UTC m=+52.968559518 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.530036 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.530099 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.530120 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.530226 4914 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:19.530191133 +0000 UTC m=+52.968827934 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.545071 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.545093 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.545121 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.545135 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.545144 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.630625 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.630821 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.630842 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.630853 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.630909 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:19.630891483 +0000 UTC m=+53.069528244 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.648196 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.648226 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.648235 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.648248 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.648258 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.731418 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.731611 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.731754 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:19.731730226 +0000 UTC m=+53.170367037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.750972 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.751061 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.751084 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.751112 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.751133 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.769992 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 20:44:33.601932846 +0000 UTC Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.817173 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.817253 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.817173 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.817317 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.817450 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:03 crc kubenswrapper[4914]: E0130 21:15:03.817548 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.853381 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.853426 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.853437 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.853453 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.853465 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.956961 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.957039 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.957096 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.957133 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:03 crc kubenswrapper[4914]: I0130 21:15:03.957159 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:03Z","lastTransitionTime":"2026-01-30T21:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.060117 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.060180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.060204 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.060439 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.060501 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.166833 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.166895 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.166911 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.166998 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.167019 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.270687 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.270781 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.270799 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.270823 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.270841 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.373538 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.373597 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.373616 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.373642 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.373660 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.476924 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.476987 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.477010 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.477038 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.477059 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.580308 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.580386 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.580408 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.580437 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.580463 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.683403 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.683464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.683481 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.683511 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.683530 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.770196 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:17:21.800723496 +0000 UTC Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.786938 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.787006 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.787024 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.787058 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.787077 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.817659 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:04 crc kubenswrapper[4914]: E0130 21:15:04.817944 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.889989 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.890047 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.890058 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.890082 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.890101 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.993417 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.993475 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.993492 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.993516 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:04 crc kubenswrapper[4914]: I0130 21:15:04.993534 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:04Z","lastTransitionTime":"2026-01-30T21:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.096899 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.096965 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.096991 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.097023 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.097047 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.199346 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.199402 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.199420 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.199448 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.199472 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.301670 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.301818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.301842 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.301872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.301897 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.404487 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.404549 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.404566 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.404590 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.404606 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.507630 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.508060 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.508265 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.508524 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.508758 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.612243 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.612752 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.613118 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.613472 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.613842 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.717451 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.717496 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.717507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.717522 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.717533 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.770986 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 04:44:00.124348994 +0000 UTC Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.817369 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.817439 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.817506 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:05 crc kubenswrapper[4914]: E0130 21:15:05.817550 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:05 crc kubenswrapper[4914]: E0130 21:15:05.817676 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:05 crc kubenswrapper[4914]: E0130 21:15:05.818054 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.823212 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.823257 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.823273 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.823293 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.823306 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.926835 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.927069 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.927171 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.927253 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:05 crc kubenswrapper[4914]: I0130 21:15:05.927349 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:05Z","lastTransitionTime":"2026-01-30T21:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.030668 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.030738 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.030751 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.030769 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.030781 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.134831 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.134890 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.134907 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.134932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.134950 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.237806 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.237880 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.237898 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.237925 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.237943 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.342294 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.342407 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.342430 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.342504 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.342523 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.445564 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.445618 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.445633 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.445657 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.445675 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.548551 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.548608 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.548625 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.548650 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.548668 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.652489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.652544 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.652562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.652585 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.652603 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.682554 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.700908 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.717166 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.733474 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.756089 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.756141 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.756075 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.756155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.756341 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.756352 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.771161 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 00:28:13.682697404 +0000 UTC Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.773566 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.791450 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.808365 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.817829 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:06 crc kubenswrapper[4914]: E0130 21:15:06.818029 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.825516 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.845899 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.859882 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:06 crc 
kubenswrapper[4914]: I0130 21:15:06.859917 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.859932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.859953 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.859966 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.884695 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 
21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.918595 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.937775 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc 
kubenswrapper[4914]: I0130 21:15:06.963673 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.963788 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.963816 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.963853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.963906 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:06Z","lastTransitionTime":"2026-01-30T21:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.969594 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:06 crc kubenswrapper[4914]: I0130 21:15:06.989164 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:06Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.010753 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.027158 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.048237 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.068506 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.068577 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.068598 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.068627 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.068650 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.172077 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.172155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.172174 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.172201 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.172220 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.276134 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.276218 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.276244 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.276275 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.276299 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.379556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.379619 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.379637 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.379660 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.379678 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.482681 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.482772 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.482790 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.482815 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.482833 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.586360 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.586423 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.586441 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.586466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.586485 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.690204 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.690256 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.690274 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.690300 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.690318 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.771918 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 15:40:20.094134739 +0000 UTC Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.793036 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.793097 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.793114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.793140 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.793157 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.817388 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.817507 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:07 crc kubenswrapper[4914]: E0130 21:15:07.817664 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.817924 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:07 crc kubenswrapper[4914]: E0130 21:15:07.818114 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:07 crc kubenswrapper[4914]: E0130 21:15:07.818244 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.844746 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 
21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.864557 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.884499 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.896911 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.896951 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.896964 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.896982 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.896994 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.902418 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.920124 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.920180 4914 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.920201 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.920226 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.920245 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.920821 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: E0130 21:15:07.945335 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.950315 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.950368 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.950387 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.950412 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.950431 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.952262 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service 
openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: E0130 21:15:07.971066 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.975361 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.975405 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.975423 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.975446 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.975463 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.985410 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: E0130 21:15:07.992136 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.996964 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.997029 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.997053 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.997084 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:07 crc kubenswrapper[4914]: I0130 21:15:07.997106 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:07Z","lastTransitionTime":"2026-01-30T21:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.006468 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: E0130 21:15:08.015062 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.018901 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.018928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.018939 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.018954 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.018966 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.025458 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: E0130 21:15:08.039950 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: E0130 21:15:08.040180 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.042088 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.042271 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.042432 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.042580 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.042748 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.045642 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.063795 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.081086 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.098554 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27
929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:
14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.113084 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc 
kubenswrapper[4914]: I0130 21:15:08.128460 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.142686 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.146067 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.146126 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.146139 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.146155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.146166 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.161294 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:08Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.252094 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.252142 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.252157 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.252175 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.252188 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.354206 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.354242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.354254 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.354270 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.354282 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.457922 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.457988 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.458008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.458033 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.458064 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.561335 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.561407 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.561424 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.561449 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.561466 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.664963 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.665048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.665073 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.665101 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.665119 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.768239 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.768293 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.768329 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.768353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.768371 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.772658 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 19:08:29.343876367 +0000 UTC Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.817490 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:08 crc kubenswrapper[4914]: E0130 21:15:08.817700 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.871349 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.871424 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.871446 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.871479 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.871518 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.975406 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.975489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.975511 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.975538 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:08 crc kubenswrapper[4914]: I0130 21:15:08.975558 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:08Z","lastTransitionTime":"2026-01-30T21:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.078948 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.079013 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.079031 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.079057 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.079075 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.225077 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.225118 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.225127 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.225143 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.225153 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.328454 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.328518 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.328535 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.328560 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.328578 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.431338 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.431409 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.431427 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.431453 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.431471 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.534658 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.534791 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.534818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.534845 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.534862 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.638028 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.638127 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.638179 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.638206 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.638222 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.742414 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.742531 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.742552 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.742575 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.742593 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.773045 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 18:18:38.12215551 +0000 UTC Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.817596 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.817809 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.817843 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:09 crc kubenswrapper[4914]: E0130 21:15:09.818019 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:09 crc kubenswrapper[4914]: E0130 21:15:09.818334 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:09 crc kubenswrapper[4914]: E0130 21:15:09.818555 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.845272 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.845323 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.845342 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.845368 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.845387 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.947923 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.947991 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.948016 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.948045 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:09 crc kubenswrapper[4914]: I0130 21:15:09.948065 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:09Z","lastTransitionTime":"2026-01-30T21:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.050869 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.050940 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.050960 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.050985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.051004 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.154397 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.154446 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.154459 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.154477 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.154490 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.256967 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.257008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.257018 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.257034 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.257045 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.360461 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.360526 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.360547 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.360578 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.360606 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.463639 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.463782 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.463802 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.463858 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.463879 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.566562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.566622 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.566641 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.566666 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.566685 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.669393 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.669467 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.669480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.669498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.669509 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.772492 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.772543 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.772559 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.772582 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.772599 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.773447 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 06:50:57.770990891 +0000 UTC Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.817875 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:10 crc kubenswrapper[4914]: E0130 21:15:10.818024 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.876075 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.876142 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.876165 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.876196 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.876220 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.979465 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.979552 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.979569 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.979592 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:10 crc kubenswrapper[4914]: I0130 21:15:10.979609 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:10Z","lastTransitionTime":"2026-01-30T21:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.082512 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.082570 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.082589 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.082614 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.082633 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.185516 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.185582 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.185605 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.185668 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.185700 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.287639 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.287744 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.287762 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.287790 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.287808 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.391044 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.391110 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.391129 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.391153 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.391171 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.494104 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.494170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.494191 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.494218 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.494239 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.598596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.598679 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.598698 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.598747 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.598769 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.702276 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.702621 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.702814 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.702961 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.703102 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.774372 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:29:48.898888205 +0000 UTC Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.805771 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.805830 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.805848 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.805873 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.805890 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.817478 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.817609 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:11 crc kubenswrapper[4914]: E0130 21:15:11.817672 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:11 crc kubenswrapper[4914]: E0130 21:15:11.817862 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.818075 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:11 crc kubenswrapper[4914]: E0130 21:15:11.818402 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.909380 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.909452 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.909469 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.909495 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:11 crc kubenswrapper[4914]: I0130 21:15:11.909513 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:11Z","lastTransitionTime":"2026-01-30T21:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.012584 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.012640 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.012656 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.012680 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.012697 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.115638 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.115745 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.115771 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.115800 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.115817 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.218322 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.218395 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.218418 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.218450 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.218474 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.322123 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.322184 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.322204 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.322230 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.322249 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.426269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.426347 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.426372 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.426408 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.426433 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.529257 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.529321 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.529344 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.529374 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.529398 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.633361 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.633408 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.633424 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.633448 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.633465 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.736558 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.736636 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.736654 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.736679 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.736697 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.775275 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 07:17:46.787082572 +0000 UTC Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.817178 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:12 crc kubenswrapper[4914]: E0130 21:15:12.817360 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.839609 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.839671 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.839690 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.839745 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.839772 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.942450 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.942529 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.942551 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.942581 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:12 crc kubenswrapper[4914]: I0130 21:15:12.942602 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:12Z","lastTransitionTime":"2026-01-30T21:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.046066 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.046139 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.046163 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.046194 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.046216 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.149305 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.149358 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.149380 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.149404 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.149421 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.252747 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.252808 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.252828 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.252853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.252872 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.355496 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.355553 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.355570 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.355593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.355612 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.458367 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.458437 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.458461 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.458492 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.458515 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.561847 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.561912 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.561930 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.561956 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.561977 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.664547 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.664598 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.664615 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.664640 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.664657 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.768144 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.768205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.768231 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.768254 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.768271 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.775739 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:20:09.471425282 +0000 UTC Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.817396 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.817474 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.817494 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:13 crc kubenswrapper[4914]: E0130 21:15:13.817574 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:13 crc kubenswrapper[4914]: E0130 21:15:13.817702 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:13 crc kubenswrapper[4914]: E0130 21:15:13.817921 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.870801 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.870858 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.870875 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.870911 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.870930 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.973088 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.973151 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.973174 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.973205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:13 crc kubenswrapper[4914]: I0130 21:15:13.973226 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:13Z","lastTransitionTime":"2026-01-30T21:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.075805 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.075860 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.075883 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.075913 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.075935 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.178589 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.178645 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.178663 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.178687 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.178732 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.281767 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.281871 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.281891 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.281916 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.281934 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.384636 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.384680 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.384697 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.384750 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.384773 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.487517 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.487593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.487616 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.487641 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.487657 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.590939 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.590975 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.590985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.591001 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.591012 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.694508 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.694562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.694578 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.694664 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.694682 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.776744 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 03:17:58.870035492 +0000 UTC Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.797790 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.797843 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.797859 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.797882 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.797899 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.817642 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:14 crc kubenswrapper[4914]: E0130 21:15:14.817745 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.901025 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.901088 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.901105 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.901129 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:14 crc kubenswrapper[4914]: I0130 21:15:14.901147 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:14Z","lastTransitionTime":"2026-01-30T21:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.003669 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.003782 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.003838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.003866 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.003886 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.107059 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.107118 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.107139 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.107169 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.107195 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.210227 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.210258 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.210269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.210284 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.210295 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.312687 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.312814 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.312840 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.312869 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.312892 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.415644 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.415694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.415744 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.415774 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.415791 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.518748 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.518801 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.518820 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.518844 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.518861 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.621507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.621583 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.621609 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.621639 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.621662 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.724615 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.724692 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.724747 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.724778 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.724799 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.777564 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 07:09:44.442227582 +0000 UTC Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.817815 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:15 crc kubenswrapper[4914]: E0130 21:15:15.817986 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.818072 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.818260 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:15 crc kubenswrapper[4914]: E0130 21:15:15.818363 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:15 crc kubenswrapper[4914]: E0130 21:15:15.818471 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.827040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.827114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.827143 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.827176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.827200 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.930370 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.930428 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.930447 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.930471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:15 crc kubenswrapper[4914]: I0130 21:15:15.930489 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:15Z","lastTransitionTime":"2026-01-30T21:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.033537 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.033593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.033610 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.033632 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.033649 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.136613 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.136664 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.136684 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.136736 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.136755 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.240002 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.240055 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.240075 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.240098 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.240116 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.343626 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.343693 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.343737 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.343763 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.343781 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.447036 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.447083 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.447099 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.447155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.447173 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.550094 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.550165 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.550188 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.550217 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.550241 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.653838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.653897 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.653915 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.653940 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.653960 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.757389 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.757448 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.757464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.757488 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.757509 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.778303 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:10:32.507999484 +0000 UTC Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.817919 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:16 crc kubenswrapper[4914]: E0130 21:15:16.818130 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.860656 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.860743 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.860763 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.860788 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.860806 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.963857 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.963934 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.963965 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.963995 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:16 crc kubenswrapper[4914]: I0130 21:15:16.964018 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:16Z","lastTransitionTime":"2026-01-30T21:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.067324 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.067386 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.067403 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.067433 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.067451 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.176688 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.176770 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.176788 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.176813 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.176832 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.279924 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.279975 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.279993 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.280018 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.280034 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.383388 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.383451 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.383468 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.383491 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.383509 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.486428 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.486497 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.486515 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.486542 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.486560 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.589398 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.589461 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.589480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.589505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.589522 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.692442 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.692503 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.692519 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.692542 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.692560 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.778692 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 06:28:26.52931534 +0000 UTC Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.795224 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.795284 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.795301 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.795326 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.795345 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.817650 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.817881 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:17 crc kubenswrapper[4914]: E0130 21:15:17.818081 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.818137 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:17 crc kubenswrapper[4914]: E0130 21:15:17.818297 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:17 crc kubenswrapper[4914]: E0130 21:15:17.818439 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.820209 4914 scope.go:117] "RemoveContainer" containerID="613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.833265 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.854942 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.875987 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.899212 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.899267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.899285 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.899309 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.899327 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:17Z","lastTransitionTime":"2026-01-30T21:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.899800 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.920124 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.938696 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.955006 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.975966 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:17 crc kubenswrapper[4914]: I0130 21:15:17.996769 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.002581 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.002647 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.002672 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.002743 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.002775 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.018592 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf815
80e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.040450 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.070334 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.085471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.085505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.085521 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.085543 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.085557 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.094074 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2e
cf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.104655 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.108638 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.108673 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.108684 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.108702 4914 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.108731 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.121517 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service 
openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.129262 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.135001 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.135037 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.135049 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.135065 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.135077 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.151198 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.156187 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.156255 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.156281 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.156315 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.156340 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.156466 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.172842 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.182007 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\
\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.186246 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.186346 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.186366 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.186389 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.186410 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.197400 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.199488 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/1.log" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.204444 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.204999 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.210516 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.210678 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.213615 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.213646 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.213659 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.213675 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.213687 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.229430 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service 
openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.255369 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.280576 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:
29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.301241 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.317059 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.317136 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.317168 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.317202 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.317261 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.319807 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.341085 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.362785 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.388333 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27
929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:
14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.403395 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc 
kubenswrapper[4914]: I0130 21:15:18.415409 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.419946 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.419969 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.419977 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.419989 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.419998 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.424234 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.433612 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.448364 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.462106 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.473519 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.485301 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.495289 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.522552 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.522593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.522623 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.522643 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.522654 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.626094 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.626178 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.626195 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.626220 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.626235 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.733065 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.733306 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.733399 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.733436 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.733470 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.779652 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:13:44.466683881 +0000 UTC Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.817392 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:18 crc kubenswrapper[4914]: E0130 21:15:18.817634 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.837265 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.837640 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.837872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.838147 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.838354 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.941863 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.941947 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.941966 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.941991 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:18 crc kubenswrapper[4914]: I0130 21:15:18.942009 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:18Z","lastTransitionTime":"2026-01-30T21:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.045545 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.045612 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.045629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.045655 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.045673 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.149372 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.149432 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.149448 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.149471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.149488 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.212156 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/2.log" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.213550 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/1.log" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.218038 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0" exitCode=1 Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.218098 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.218162 4914 scope.go:117] "RemoveContainer" containerID="613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.219206 4914 scope.go:117] "RemoveContainer" containerID="58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0" Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.219474 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.238795 4914 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.252258 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.252312 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.252330 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.252359 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.252378 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.255120 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.271463 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.292571 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.309023 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.327007 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.343453 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.355490 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 
21:15:19.355551 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.355571 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.355597 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.355615 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.359816 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.390952 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://613d24fe5244200452599a9256eef7ab842b0658b4693064bc50583cd90b8beb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"message\\\":\\\"retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244547 6361 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0130 21:15:01.244554 6361 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0130 21:15:01.244553 6361 services_controller.go:452] Built service 
openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nI0130 21:15:01.244429 6361 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-wt7n5 after 0 failed attempt(s)\\\\nF0130 21:15:01.244565 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"moun
tPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.422537 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.442797 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.459281 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 
21:15:19.459334 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.459353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.459377 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.459396 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.462423 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.481040 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.500261 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.521268 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.545005 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27
929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:
14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.559429 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:19 crc 
kubenswrapper[4914]: I0130 21:15:19.562763 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.562827 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.562852 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.562888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.562912 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.565107 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.565197 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.565223 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.565243 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565347 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 
21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565347 4914 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565360 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565434 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:51.565406686 +0000 UTC m=+85.004043487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565451 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565468 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:15:51.565450137 +0000 UTC m=+85.004086938 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565505 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565541 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:51.565517088 +0000 UTC m=+85.004153889 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.565593 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:51.565566369 +0000 UTC m=+85.004203160 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.665775 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.665958 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.665985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.665998 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.666017 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.666031 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.666052 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.666107 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.666135 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.666246 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:51.666214028 +0000 UTC m=+85.104850869 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.766804 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.766987 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.767095 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. No retries permitted until 2026-01-30 21:15:51.76706427 +0000 UTC m=+85.205701071 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.768755 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.768802 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.768826 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.768854 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.768876 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.780755 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 23:00:18.034771176 +0000 UTC Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.817630 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.817842 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.817895 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.817955 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.818118 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:19 crc kubenswrapper[4914]: E0130 21:15:19.818229 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.871596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.871659 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.871681 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.871748 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.871776 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.975161 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.975220 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.975240 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.975267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:19 crc kubenswrapper[4914]: I0130 21:15:19.975286 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:19Z","lastTransitionTime":"2026-01-30T21:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.078085 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.078153 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.078169 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.078193 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.078213 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.181922 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.181980 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.181998 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.182020 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.182038 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.224991 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/2.log" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.229940 4914 scope.go:117] "RemoveContainer" containerID="58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0" Jan 30 21:15:20 crc kubenswrapper[4914]: E0130 21:15:20.230189 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.250344 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.270135 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.285789 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.285878 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.285932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.286067 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.286135 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.289519 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.307855 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.328091 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.359395 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.389464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.389544 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.389571 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.389605 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.389631 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.392528 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.409883 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc 
kubenswrapper[4914]: I0130 21:15:20.433277 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e
8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.448606 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72
bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.464540 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.485466 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.494842 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.494941 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.494959 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.494983 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.495001 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.509781 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.529243 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.548741 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.566148 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.586323 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.597775 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.597827 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.597847 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.597871 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.597889 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.700698 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.700788 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.700806 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.700833 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.700851 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.781643 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 18:40:35.570873866 +0000 UTC Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.803880 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.803946 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.803964 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.803993 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.804011 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.817276 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:20 crc kubenswrapper[4914]: E0130 21:15:20.817448 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.906588 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.906648 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.906667 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.906692 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:20 crc kubenswrapper[4914]: I0130 21:15:20.906748 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:20Z","lastTransitionTime":"2026-01-30T21:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.010178 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.010246 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.010266 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.010290 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.010309 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.113160 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.113198 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.113209 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.113229 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.113242 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.215915 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.215960 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.215976 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.215999 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.216018 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.318138 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.318201 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.318219 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.318242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.318259 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.421656 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.421752 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.421773 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.421803 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.421823 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.525004 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.525063 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.525083 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.525107 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.525125 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.627652 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.627769 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.627789 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.627813 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.627830 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.730431 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.730499 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.730518 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.730542 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.730560 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.782392 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 23:00:58.904448455 +0000 UTC Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.817525 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.817541 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.817798 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:21 crc kubenswrapper[4914]: E0130 21:15:21.817691 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:21 crc kubenswrapper[4914]: E0130 21:15:21.817896 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:21 crc kubenswrapper[4914]: E0130 21:15:21.817987 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.833222 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.833264 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.833275 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.833291 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.833303 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.936978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.937041 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.937107 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.937135 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:21 crc kubenswrapper[4914]: I0130 21:15:21.937152 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:21Z","lastTransitionTime":"2026-01-30T21:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.040667 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.040819 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.040842 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.040872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.040895 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.143871 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.143944 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.143968 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.144003 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.144026 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.247048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.247134 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.247155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.247182 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.247202 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.350914 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.350985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.351008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.351040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.351062 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.453995 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.454072 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.454090 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.454115 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.454133 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.557977 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.558054 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.558079 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.558105 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.558123 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.661636 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.661739 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.661767 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.661804 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.661829 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.764291 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.764378 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.764405 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.764433 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.764453 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.783004 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:20:53.509308436 +0000 UTC Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.791919 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.804844 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.817058 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:22 crc kubenswrapper[4914]: E0130 21:15:22.817229 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.824959 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8
b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.843637 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.863042 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.867639 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.867696 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.867744 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.867768 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.867787 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.882146 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.903245 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.924252 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.960101 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.970567 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.970703 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.970777 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.970802 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.970861 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:22Z","lastTransitionTime":"2026-01-30T21:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:22 crc kubenswrapper[4914]: I0130 21:15:22.984619 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.000656 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.020490 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.035595 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.052803 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.074013 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.074088 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.074115 4914 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.074143 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.074167 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.075412 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.095115 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.113409 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.132060 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.147842 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.177274 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.177340 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.177364 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.177395 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.177418 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.280267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.280336 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.280362 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.280392 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.280416 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.383392 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.383466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.383486 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.383509 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.383527 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.486938 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.486992 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.487008 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.487049 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.487067 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.589941 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.589996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.590013 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.590037 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.590057 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.693333 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.693393 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.693414 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.693442 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.693463 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.783928 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 18:03:09.382451168 +0000 UTC Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.796317 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.796504 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.796632 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.796830 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.797228 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.819006 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:23 crc kubenswrapper[4914]: E0130 21:15:23.819183 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.819015 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.819258 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:23 crc kubenswrapper[4914]: E0130 21:15:23.819306 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:23 crc kubenswrapper[4914]: E0130 21:15:23.819402 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.900822 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.900895 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.900917 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.900947 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:23 crc kubenswrapper[4914]: I0130 21:15:23.900970 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:23Z","lastTransitionTime":"2026-01-30T21:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.004102 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.004163 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.004184 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.004211 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.004230 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.106746 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.106798 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.106814 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.106837 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.106855 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.209487 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.209545 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.209562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.209587 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.209604 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.312680 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.312765 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.312785 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.312808 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.312824 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.416031 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.416110 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.416135 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.416167 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.416192 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.518806 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.518851 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.518867 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.518888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.518908 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.622206 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.622280 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.622306 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.622340 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.622363 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.725173 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.725229 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.725251 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.725282 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.725302 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.784868 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 16:27:23.494495427 +0000 UTC Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.817230 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:24 crc kubenswrapper[4914]: E0130 21:15:24.817391 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.828405 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.828464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.828483 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.828505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.828537 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.931180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.931364 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.931383 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.931405 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:24 crc kubenswrapper[4914]: I0130 21:15:24.931422 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:24Z","lastTransitionTime":"2026-01-30T21:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.033795 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.033845 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.033865 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.033888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.033906 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.136777 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.136854 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.136872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.136894 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.136948 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.240050 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.240099 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.240119 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.240139 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.240157 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.342534 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.342591 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.342608 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.342631 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.342648 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.445677 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.445772 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.445791 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.445818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.445834 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.549112 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.549172 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.549191 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.549215 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.549233 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.651584 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.651640 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.651657 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.651683 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.651736 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.754201 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.754528 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.754834 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.755015 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.755171 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.785989 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 15:14:42.354263838 +0000 UTC Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.817576 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.817678 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.817801 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:25 crc kubenswrapper[4914]: E0130 21:15:25.817983 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:25 crc kubenswrapper[4914]: E0130 21:15:25.818165 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:25 crc kubenswrapper[4914]: E0130 21:15:25.818423 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.858009 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.858061 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.858083 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.858113 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.858136 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.960642 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.960701 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.960762 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.960788 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:25 crc kubenswrapper[4914]: I0130 21:15:25.960805 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:25Z","lastTransitionTime":"2026-01-30T21:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.064178 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.064251 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.064269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.064298 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.064317 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.167022 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.167055 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.167063 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.167076 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.167085 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.269665 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.269752 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.269771 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.269797 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.269816 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.372480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.372543 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.372563 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.372587 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.372604 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.475673 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.475758 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.475777 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.475800 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.475819 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.579046 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.579121 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.579145 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.579175 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.579197 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.682467 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.682540 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.682563 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.682593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.682616 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.786037 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.787304 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 06:12:20.961047623 +0000 UTC Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.787918 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.787978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.788007 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.788028 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.818085 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:26 crc kubenswrapper[4914]: E0130 21:15:26.818315 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.896403 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.896471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.896489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.896516 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:26 crc kubenswrapper[4914]: I0130 21:15:26.896534 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:26Z","lastTransitionTime":"2026-01-30T21:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.000000 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.000057 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.000074 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.000098 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.000115 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.103962 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.104019 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.104036 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.104062 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.104080 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.207308 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.207394 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.207413 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.207439 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.207456 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.310489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.310568 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.310594 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.310642 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.310660 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.413291 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.413374 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.413401 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.413430 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.413448 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.516426 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.516507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.516530 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.516577 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.516601 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.620018 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.620099 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.620120 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.620155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.620173 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.723619 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.723743 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.723774 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.723807 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.723830 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.788630 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 12:17:31.590321744 +0000 UTC Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.819244 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.819304 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:27 crc kubenswrapper[4914]: E0130 21:15:27.819628 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.819686 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:27 crc kubenswrapper[4914]: E0130 21:15:27.820216 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:27 crc kubenswrapper[4914]: E0130 21:15:27.820431 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.826527 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.826585 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.826605 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.826635 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.826658 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.844658 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.876860 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.900529 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.918869 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.928915 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.928971 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.928984 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.929001 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.929014 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:27Z","lastTransitionTime":"2026-01-30T21:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.948166 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.961853 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.976336 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:27 crc kubenswrapper[4914]: I0130 21:15:27.993531 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.010955 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.028839 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.032289 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.032367 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.032388 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.032421 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.032441 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.048613 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:
14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.078922 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.103818 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.119312 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc 
kubenswrapper[4914]: I0130 21:15:28.135684 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.135983 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.136069 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.136182 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.136284 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.141256 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.155426 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.171067 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.194790 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216
632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.238365 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.238438 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.238464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.238497 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.238521 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.307272 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.307318 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.307330 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.307349 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.307365 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.325923 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.331744 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.331794 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.331811 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.331831 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.331849 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.350784 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.355917 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.356224 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.356369 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.356504 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.356655 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.377995 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.384070 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.384119 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.384142 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.384167 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.384212 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.407281 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.412875 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.412942 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.412962 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.412990 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.413009 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.434896 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:28Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.435409 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.438431 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.438478 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.438494 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.438553 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.438574 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.541796 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.541840 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.541853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.541872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.541884 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.645493 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.645556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.645582 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.645613 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.645634 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.748969 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.749032 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.749049 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.749074 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.749090 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.789634 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 04:17:14.700517695 +0000 UTC
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.817101 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 21:15:28 crc kubenswrapper[4914]: E0130 21:15:28.817333 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.852945 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.853372 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.853610 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.853877 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.854104 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.957072 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.957176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.957202 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.957234 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:28 crc kubenswrapper[4914]: I0130 21:15:28.957255 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:28Z","lastTransitionTime":"2026-01-30T21:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.059688 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.059775 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.059793 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.059818 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.059837 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.162833 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.162907 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.162925 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.162949 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.162967 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.264936 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.265096 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.265115 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.265138 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.265155 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.368650 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.368927 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.369117 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.369276 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.369416 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.472751 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.473106 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.473265 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.473413 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.473558 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.576183 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.576401 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.576559 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.576698 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.576913 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.679853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.679926 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.679949 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.679978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.680000 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.783393 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.783473 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.783488 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.783508 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.783522 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.790668 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:05:31.066392734 +0000 UTC Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.817324 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.817344 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.817439 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:29 crc kubenswrapper[4914]: E0130 21:15:29.817623 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:29 crc kubenswrapper[4914]: E0130 21:15:29.818100 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:29 crc kubenswrapper[4914]: E0130 21:15:29.818219 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.886601 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.886666 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.886685 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.886756 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.886777 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.989810 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.989881 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.989893 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.989912 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:29 crc kubenswrapper[4914]: I0130 21:15:29.989925 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:29Z","lastTransitionTime":"2026-01-30T21:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.092655 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.092730 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.092749 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.092768 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.092783 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.196782 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.196840 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.196860 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.196885 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.196904 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.299646 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.299690 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.299699 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.299726 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.299738 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.403045 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.403110 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.403127 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.403154 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.403172 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.506592 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.506653 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.506674 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.506697 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.506756 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.610311 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.610371 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.610391 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.610416 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.610435 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.713411 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.713522 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.713542 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.713569 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.713593 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.791548 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 21:10:41.686968565 +0000 UTC Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.816640 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.816691 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.816733 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.816757 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.816775 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.817038 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:30 crc kubenswrapper[4914]: E0130 21:15:30.817239 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.919820 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.919875 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.919892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.919915 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:30 crc kubenswrapper[4914]: I0130 21:15:30.919933 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:30Z","lastTransitionTime":"2026-01-30T21:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.023358 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.023432 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.023456 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.023484 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.023504 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.125980 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.126061 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.126081 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.126115 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.126138 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.229309 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.229353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.229367 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.229389 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.229400 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.332764 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.332803 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.332811 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.332825 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.332835 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.435468 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.435502 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.435511 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.435525 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.435534 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.538441 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.538476 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.538484 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.538498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.538509 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.640792 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.640820 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.640828 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.640840 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.640850 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.743493 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.743572 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.743596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.743627 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.743649 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.791868 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 09:09:01.522850024 +0000 UTC Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.817612 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.817755 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.818013 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:31 crc kubenswrapper[4914]: E0130 21:15:31.818010 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:31 crc kubenswrapper[4914]: E0130 21:15:31.818189 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:31 crc kubenswrapper[4914]: E0130 21:15:31.818309 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.819281 4914 scope.go:117] "RemoveContainer" containerID="58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0" Jan 30 21:15:31 crc kubenswrapper[4914]: E0130 21:15:31.819601 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.846423 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.846468 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.846486 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.846507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.846525 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.949053 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.949099 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.949116 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.949138 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:31 crc kubenswrapper[4914]: I0130 21:15:31.949154 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:31Z","lastTransitionTime":"2026-01-30T21:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.051912 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.051970 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.051988 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.052012 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.052029 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.154913 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.154951 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.154962 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.154978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.154991 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.256887 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.256982 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.256997 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.257015 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.257026 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.359267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.359323 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.359340 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.359368 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.359387 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.461473 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.461505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.461515 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.461529 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.461539 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.564961 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.564989 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.564998 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.565011 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.565019 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.667172 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.667229 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.667246 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.667269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.667286 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.769644 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.769700 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.769758 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.769780 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.769797 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.792094 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:31:42.118344105 +0000 UTC Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.817428 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:32 crc kubenswrapper[4914]: E0130 21:15:32.817731 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.872658 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.872694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.872722 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.872740 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.872751 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.974975 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.975030 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.975047 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.975070 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:32 crc kubenswrapper[4914]: I0130 21:15:32.975088 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:32Z","lastTransitionTime":"2026-01-30T21:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.078103 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.078143 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.078155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.078172 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.078184 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.179875 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.179910 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.179918 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.179931 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.179940 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.281270 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.281296 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.281303 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.281316 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.281324 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.383556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.383584 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.383592 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.383604 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.383613 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.485938 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.485970 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.485980 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.485996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.486007 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.588484 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.588543 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.588561 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.588586 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.588604 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.691114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.691160 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.691173 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.691190 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.691201 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.792300 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:32:06.484789449 +0000 UTC Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.793698 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.793778 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.793800 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.793827 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.793851 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.817255 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.817332 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:33 crc kubenswrapper[4914]: E0130 21:15:33.817418 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.817450 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:33 crc kubenswrapper[4914]: E0130 21:15:33.817540 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:33 crc kubenswrapper[4914]: E0130 21:15:33.817727 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.897281 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.897350 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.897369 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.897396 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:33 crc kubenswrapper[4914]: I0130 21:15:33.897416 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:33Z","lastTransitionTime":"2026-01-30T21:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.000931 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.001000 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.001021 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.001049 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.001071 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.105204 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.105269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.105288 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.105313 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.105331 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.209512 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.209575 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.209595 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.209626 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.209651 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.313440 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.313508 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.313522 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.313542 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.313555 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.417109 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.417167 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.417181 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.417202 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.417215 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.520769 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.520829 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.520844 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.520865 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.520879 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.623972 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.624014 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.624023 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.624040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.624050 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.727025 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.727091 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.727113 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.727142 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.727164 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.793316 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 03:45:35.890999055 +0000 UTC Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.817976 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:34 crc kubenswrapper[4914]: E0130 21:15:34.818205 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.829790 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.829829 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.829838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.829853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.829865 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.932360 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.932409 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.932422 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.932443 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:34 crc kubenswrapper[4914]: I0130 21:15:34.932456 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:34Z","lastTransitionTime":"2026-01-30T21:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.034858 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.034905 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.034922 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.034939 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.034951 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.137658 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.137685 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.137693 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.137723 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.137733 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.239675 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.239736 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.239749 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.239765 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.239775 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.342271 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.342302 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.342311 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.342326 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.342336 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.448570 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.448607 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.448615 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.448631 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.448641 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.551257 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.551288 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.551297 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.551310 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.551319 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.653745 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.653826 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.653853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.653885 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.653906 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.756287 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.756329 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.756337 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.756352 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.756362 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.793912 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:01:42.711126933 +0000 UTC Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.817331 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.817426 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.817503 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:35 crc kubenswrapper[4914]: E0130 21:15:35.817687 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:35 crc kubenswrapper[4914]: E0130 21:15:35.817854 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:35 crc kubenswrapper[4914]: E0130 21:15:35.818050 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.858531 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.858565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.858574 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.858587 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.858596 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.960395 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.960447 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.960465 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.960487 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:35 crc kubenswrapper[4914]: I0130 21:15:35.960505 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:35Z","lastTransitionTime":"2026-01-30T21:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.063817 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.063853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.063864 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.063878 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.063886 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.166790 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.166848 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.166867 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.166890 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.166908 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.269523 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.269939 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.269979 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.270014 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.270041 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.282077 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/0.log" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.282147 4914 generic.go:334] "Generic (PLEG): container finished" podID="c1067fc5-9bff-4a81-982f-b2cca1c432d0" containerID="ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b" exitCode=1 Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.282185 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerDied","Data":"ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.282766 4914 scope.go:117] "RemoveContainer" containerID="ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.299296 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216
632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.316590 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.334450 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.353047 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.369056 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.373205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.373241 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.373255 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.373276 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.373293 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.399832 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.418076 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.432524 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.447875 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.464086 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.475841 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.475869 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.475878 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.475892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.475901 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.481149 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.496978 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.525839 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 
21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.547835 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.561074 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc 
kubenswrapper[4914]: I0130 21:15:36.577972 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.578031 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.578048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.578073 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.578091 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.583745 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.596637 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.611750 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.680289 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.680537 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.680603 4914 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.680681 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.680767 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.783761 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.784003 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.784091 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.784185 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.784279 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.795096 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:52:23.5174896 +0000 UTC Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.817543 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:36 crc kubenswrapper[4914]: E0130 21:15:36.817759 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.887183 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.887227 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.887239 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.887256 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.887273 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.990175 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.990222 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.990234 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.990252 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:36 crc kubenswrapper[4914]: I0130 21:15:36.990264 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:36Z","lastTransitionTime":"2026-01-30T21:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.092985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.093038 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.093054 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.093080 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.093101 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.196286 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.196316 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.196324 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.196339 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.196349 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.288100 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/0.log" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.288142 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerStarted","Data":"556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.298630 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.298652 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.298660 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.298672 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.298680 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.309110 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.324109 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.342957 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.359548 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.374284 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.391699 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.400897 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.400952 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.400970 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.400994 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.401014 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.411924 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62
c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.442622 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.476234 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.498077 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:
29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.503170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.503204 
4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.503216 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.503234 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.503245 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.516416 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.539258 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.558360 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.581625 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.598151 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc 
kubenswrapper[4914]: I0130 21:15:37.605354 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.605390 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.605401 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.605416 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.605425 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.621769 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.635470 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.650679 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.708044 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.708072 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.708083 4914 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.708098 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.708111 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.796001 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 13:23:04.790186184 +0000 UTC Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.810845 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.810882 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.810893 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.810909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.810920 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.817076 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:37 crc kubenswrapper[4914]: E0130 21:15:37.817169 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.817207 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.817218 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:37 crc kubenswrapper[4914]: E0130 21:15:37.817377 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:37 crc kubenswrapper[4914]: E0130 21:15:37.817466 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.832482 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.845260 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.862778 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.889733 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 
21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.912738 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.912775 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.912786 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.912803 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.912815 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:37Z","lastTransitionTime":"2026-01-30T21:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.918349 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.937621 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.954481 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.970844 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:37 crc kubenswrapper[4914]: I0130 21:15:37.991464 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.005321 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc 
kubenswrapper[4914]: I0130 21:15:38.017978 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.018036 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.018048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.018065 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.018077 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.023565 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.036341 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.053011 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.071337 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216
632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.089658 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.103066 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.119233 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.121054 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 
21:15:38.121089 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.121100 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.121115 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.121126 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.133691 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.223489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.223545 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.223562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.223587 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.223604 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.326111 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.326147 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.326156 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.326169 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.326181 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.428941 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.429004 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.429030 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.429061 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.429083 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.532310 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.532381 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.532409 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.532438 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.532460 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.635669 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.635773 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.635799 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.635832 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.635854 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.655932 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.655998 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.656020 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.656044 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.656062 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.673193 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.677428 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.677463 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.677472 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.677485 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.677494 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.693186 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.697462 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.697496 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.697506 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.697517 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.697526 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.712117 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.716167 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.716199 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.716211 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.716229 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.716240 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.735545 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.739149 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.739175 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.739205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.739222 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.739234 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.751546 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.751743 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.753555 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.753600 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.753623 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.753651 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.753675 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.796158 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:34:07.356248249 +0000 UTC Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.817487 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:38 crc kubenswrapper[4914]: E0130 21:15:38.817598 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.862565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.862621 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.862641 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.862665 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.862683 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.964982 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.965024 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.965035 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.965050 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:38 crc kubenswrapper[4914]: I0130 21:15:38.965060 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:38Z","lastTransitionTime":"2026-01-30T21:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.067671 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.067756 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.067774 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.067798 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.067816 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.169909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.169939 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.169950 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.169965 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.169977 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.272313 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.272345 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.272353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.272366 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.272375 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.375057 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.375091 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.375101 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.375117 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.375126 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.477599 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.477641 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.477651 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.477667 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.477678 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.580409 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.580483 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.580502 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.580527 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.580546 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.683164 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.683217 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.683236 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.683260 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.683277 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.785472 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.785519 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.785531 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.785548 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.785559 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.796777 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 12:54:16.105398761 +0000 UTC Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.817192 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.817217 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.817261 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:39 crc kubenswrapper[4914]: E0130 21:15:39.817369 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:39 crc kubenswrapper[4914]: E0130 21:15:39.817502 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:39 crc kubenswrapper[4914]: E0130 21:15:39.817562 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.888041 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.888072 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.888082 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.888099 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.888110 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.990255 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.990294 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.990308 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.990323 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:39 crc kubenswrapper[4914]: I0130 21:15:39.990333 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:39Z","lastTransitionTime":"2026-01-30T21:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.092728 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.092786 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.092810 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.092837 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.092859 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.195278 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.195315 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.195329 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.195347 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.195358 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.296822 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.296875 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.296899 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.296928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.296953 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.399752 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.399805 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.399824 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.399848 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.399866 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.502679 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.502746 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.502777 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.502796 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.502809 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.605994 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.606035 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.606044 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.606059 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.606068 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.709050 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.709118 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.709141 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.709173 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.709192 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.797291 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 00:03:16.816705716 +0000 UTC Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.811534 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.811595 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.811613 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.811639 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.811659 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.817848 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:40 crc kubenswrapper[4914]: E0130 21:15:40.818021 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.914286 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.914336 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.914352 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.914373 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:40 crc kubenswrapper[4914]: I0130 21:15:40.914390 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:40Z","lastTransitionTime":"2026-01-30T21:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.016887 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.016993 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.017571 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.017630 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.017649 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.120446 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.120507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.120523 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.120548 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.120568 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.223222 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.223282 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.223318 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.223342 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.223362 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.326970 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.327396 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.327565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.327750 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.327941 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.432081 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.432132 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.432149 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.432173 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.432191 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.536425 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.536498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.536514 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.536534 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.536552 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.640207 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.640253 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.640261 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.640277 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.640287 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.743888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.743954 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.743971 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.744002 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.744025 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.797795 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:34:07.784616569 +0000 UTC Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.817609 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.817674 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.817846 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:41 crc kubenswrapper[4914]: E0130 21:15:41.818258 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:41 crc kubenswrapper[4914]: E0130 21:15:41.818415 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:41 crc kubenswrapper[4914]: E0130 21:15:41.818611 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.831873 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.847859 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.847933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.847952 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.848355 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.848412 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.951051 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.951119 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.951137 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.951164 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:41 crc kubenswrapper[4914]: I0130 21:15:41.951184 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:41Z","lastTransitionTime":"2026-01-30T21:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.054031 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.054086 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.054104 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.054127 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.054145 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.157816 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.157873 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.157890 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.157915 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.157932 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.260498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.260546 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.260566 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.260596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.260618 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.363803 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.363880 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.363905 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.363934 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.363957 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.467281 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.467361 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.467385 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.467412 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.467436 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.569977 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.570021 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.570037 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.570057 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.570072 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.672999 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.673048 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.673064 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.673088 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.673107 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.776324 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.776399 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.776434 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.776467 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.776489 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.798734 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 00:15:11.808180412 +0000 UTC Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.817539 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:42 crc kubenswrapper[4914]: E0130 21:15:42.817815 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.878962 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.879026 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.879050 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.879081 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.879101 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.982073 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.982114 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.982127 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.982145 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:42 crc kubenswrapper[4914]: I0130 21:15:42.982158 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:42Z","lastTransitionTime":"2026-01-30T21:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.087007 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.087069 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.087091 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.087120 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.087140 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.190267 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.190313 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.190327 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.190345 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.190381 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.293927 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.293971 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.293984 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.294003 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.294017 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.396764 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.396827 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.396851 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.396883 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.396909 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.499899 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.499953 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.499970 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.499996 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.500015 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.603171 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.603225 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.603242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.603265 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.603281 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.706494 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.706533 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.706544 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.706561 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.706574 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.799257 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:49:32.526786394 +0000 UTC Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.809349 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.809398 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.809409 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.809428 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.809444 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.817920 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.817927 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:43 crc kubenswrapper[4914]: E0130 21:15:43.818126 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.818350 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:43 crc kubenswrapper[4914]: E0130 21:15:43.818392 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:43 crc kubenswrapper[4914]: E0130 21:15:43.818513 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.913432 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.913473 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.913486 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.913500 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:43 crc kubenswrapper[4914]: I0130 21:15:43.913511 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:43Z","lastTransitionTime":"2026-01-30T21:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.016331 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.016379 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.016391 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.016406 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.016419 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.118998 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.119029 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.119037 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.119049 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.119060 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.221578 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.221646 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.221667 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.221697 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.221760 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.324646 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.324750 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.324777 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.324808 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.324832 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.427536 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.427593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.427610 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.427634 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.427652 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.531099 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.531169 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.531189 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.531217 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.531237 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.634071 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.634128 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.634147 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.634170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.634188 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.737186 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.737261 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.737282 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.737311 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.737333 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.800417 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 15:53:10.034816859 +0000 UTC Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.817792 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:44 crc kubenswrapper[4914]: E0130 21:15:44.817917 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.819392 4914 scope.go:117] "RemoveContainer" containerID="58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.840280 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.840342 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.840359 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.840385 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.840403 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.943388 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.943765 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.943777 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.943793 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:44 crc kubenswrapper[4914]: I0130 21:15:44.943805 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:44Z","lastTransitionTime":"2026-01-30T21:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.047185 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.047247 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.047264 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.047289 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.047308 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.151163 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.151242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.151266 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.151302 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.151328 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.254762 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.254809 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.254827 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.254852 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.254870 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.316158 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/2.log" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.320377 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.321100 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.353434 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 
21:15:18.849491 6572 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 
2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.357177 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.357205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.357215 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.357233 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.357250 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.378666 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.396833 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.418385 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.441939 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.459778 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.459839 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.459859 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.459888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.459906 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.460813 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.480202 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.504572 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.530487 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c
09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.546851 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc 
kubenswrapper[4914]: I0130 21:15:45.561864 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.561916 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.561934 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.561958 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.561975 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.564594 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.575725 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.598857 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119
c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.610016 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fdcb1ee-3a89-4e05-b691-ab2c9540e07a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f74a072a31d862caee808486ce40398646a7edd7a44143f51258af9e3619be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.633516 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216
632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.651207 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.663838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.663883 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.663898 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 
21:15:45.663916 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.663929 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.665753 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.680887 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c
9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.693745 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.766814 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.766853 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.766865 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.766881 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.766892 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.801257 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 15:53:59.57956664 +0000 UTC
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.817634 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.817671 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 21:15:45 crc kubenswrapper[4914]: E0130 21:15:45.817882 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.817980 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 21:15:45 crc kubenswrapper[4914]: E0130 21:15:45.818029 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496"
Jan 30 21:15:45 crc kubenswrapper[4914]: E0130 21:15:45.818172 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.869018 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.869120 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.869145 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.869174 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.869195 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.971643 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.971699 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.971755 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.971782 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:45 crc kubenswrapper[4914]: I0130 21:15:45.971804 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:45Z","lastTransitionTime":"2026-01-30T21:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.074889 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.074992 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.075013 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.075039 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.075063 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.178126 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.178202 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.178239 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.178274 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.178295 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.281209 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.281277 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.281301 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.281331 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.281354 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.325596 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/3.log"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.326442 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/2.log"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.334995 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0" exitCode=1
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.335039 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"}
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.335086 4914 scope.go:117] "RemoveContainer" containerID="58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.335943 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:15:46 crc kubenswrapper[4914]: E0130 21:15:46.336166 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.350064 4914 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.364258 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.380682 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.383482 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.383545 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.383563 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.383642 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.383668 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.395537 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.406810 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.423645 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.440823 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.456540 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.477966 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.488347 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.488390 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.488400 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.488416 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.488428 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.506030 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58c7e7a1f2b0e3daa5fd97bfaee8cea44661699b7ff442e54c3eaf10bd79a3e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:18Z\\\",\\\"message\\\":\\\"d/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849198 6572 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849491 6572 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849603 6572 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.849611 6572 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.849912 6572 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:15:18.850352 6572 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:15:18.850741 6572 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:15:18.850812 6572 factory.go:656] Stopping watch factory\\\\nI0130 21:15:18.850833 6572 ovnkube.go:599] Stopped ovnkube\\\\nI0130 2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:46Z\\\",\\\"message\\\":\\\" 6968 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0130 21:15:45.926303 6968 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-zxtk5\\\\nI0130 21:15:45.926147 6968 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nF0130 21:15:45.926320 6968 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z]\\\\nI0130 21:15:45.926327 6968 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0130 21:15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.534135 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"ex
itCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.548475 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.569869 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.586968 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc 
kubenswrapper[4914]: I0130 21:15:46.591115 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.591163 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.591180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.591205 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.591222 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.603252 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.622271 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.638763 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.650971 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fdcb1ee-3a89-4e05-b691-ab2c9540e07a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f74a072a31d862caee808486ce40398646a7edd7a44143f51258af9e3619be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.667761 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216
632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.693928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.693976 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.693991 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.694012 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.694029 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.796847 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.796893 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.796909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.796930 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.796945 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.802113 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 08:27:19.296532645 +0000 UTC Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.817512 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:46 crc kubenswrapper[4914]: E0130 21:15:46.817754 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.905344 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.905411 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.905435 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.905480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:46 crc kubenswrapper[4914]: I0130 21:15:46.905505 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:46Z","lastTransitionTime":"2026-01-30T21:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.008927 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.008985 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.009002 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.009028 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.009046 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.112185 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.112263 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.112297 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.112329 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.112352 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.215272 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.215344 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.215367 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.215398 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.215421 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.318562 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.318639 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.318664 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.318694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.318752 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.341163 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/3.log" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.345882 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0" Jan 30 21:15:47 crc kubenswrapper[4914]: E0130 21:15:47.346137 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.365349 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.384146 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.421593 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.421651 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.421670 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.421697 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.421747 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.430832 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.450533 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.465673 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.477785 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.488973 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.507381 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:46Z\\\",\\\"message\\\":\\\" 6968 ovn.go:134] Ensuring zone local for Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0130 21:15:45.926303 6968 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-zxtk5\\\\nI0130 21:15:45.926147 6968 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nF0130 21:15:45.926320 6968 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z]\\\\nI0130 21:15:45.926327 6968 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0130 21:15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.523911 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.523955 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.523972 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.523990 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.524002 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.526683 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.539060 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.551315 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.568387 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.583287 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.593098 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc 
kubenswrapper[4914]: I0130 21:15:47.609578 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.623855 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.626892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.626935 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.626950 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.626972 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.626988 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.640399 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.652793 4914 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fdcb1ee-3a89-4e05-b691-ab2c9540e07a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f74a072a31d862caee808486ce40398646a7edd7a44143f51258af9e3619be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881f
ee4749371a2d50f5232a4eb9a4eb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.673597 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216
632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.728993 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.729223 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.729297 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.729378 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.729460 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.802743 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 06:28:30.724304275 +0000 UTC Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.817326 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.817383 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.817336 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:47 crc kubenswrapper[4914]: E0130 21:15:47.817454 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:47 crc kubenswrapper[4914]: E0130 21:15:47.817570 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:47 crc kubenswrapper[4914]: E0130 21:15:47.817838 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.831238 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.831301 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.831322 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.831346 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.831369 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.833947 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3be0c366-7d83-42e6-9a85-3f77ce72281f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6333e80d14bbe0febf4fd9c246e124b8dbc5a38825a0f6785290f72719721823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wmmsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pm2tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.849263 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7xn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c99cec6-435b-4912-b6e5-eb42cf23adfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014cf07b07615d3cd08c6a2f75b39ecf3668ae02178a47cf84a151e02d4f89d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j5xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7xn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.868897 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.884800 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a9fae26366cf7800efcfb00d782ee1d2bc65e8918e48a5fa665e5d02120ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.903406 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0db45423-3fbc-4398-8b7f-4ca6dc5c26ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8759bf42864facd5f47819968351923fb2c65ccf597f6cf9ff7c60d9e3b036e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://ba1aff6c5242fbe5e0d1e6c200c68b781af18f97900c3464e114bebb27a500f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://baf15cc7ec88d6b3f78cf54a42cef4f7082519e8256fdade2d4882cd4a879f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c98e5e63721b38f5b14718b44a9dca49a5438a00725da12d9e22e757692f735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.927854 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25c4cd6783ef93c293e7b8419400626f3b67188731565cfa04905e181c9c7475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93caf81580e035ef415cd2ad95bf5bf5baf4986a771312946e9668d77dd1b289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.932889 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.932928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.932938 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.932957 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.932972 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:47Z","lastTransitionTime":"2026-01-30T21:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.950958 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.973248 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:47 crc kubenswrapper[4914]: I0130 21:15:47.994645 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvbd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1067fc5-9bff-4a81-982f-b2cca1c432d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:35Z\\\",\\\"message\\\":\\\"2026-01-30T21:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f\\\\n2026-01-30T21:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8ec91c23-a8c7-4252-9044-dca13187367f to /host/opt/cni/bin/\\\\n2026-01-30T21:14:50Z [verbose] multus-daemon started\\\\n2026-01-30T21:14:50Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T21:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tpkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvbd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:47Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.029199 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a32fa1f-f3a9-4e60-b665-51138c3ce768\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:15:46Z\\\",\\\"message\\\":\\\" 6968 ovn.go:134] Ensuring zone local for Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0130 21:15:45.926303 6968 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-zxtk5\\\\nI0130 21:15:45.926147 6968 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nF0130 21:15:45.926320 6968 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:45Z is after 2025-08-24T17:21:41Z]\\\\nI0130 21:15:45.926327 6968 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0130 21:15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a08e13b820843b37d
468de639c334c4beb47146194566829a1b31d9e7b6ba18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r27rl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hchqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.035020 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.035051 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.035066 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.035089 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.035105 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.052746 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"174c4eb7-8e56-4a3d-a78d-75f22b36701c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e195ccb70d043073f5840d1ebf9129aadda6a9222ad6d09b30f0ea7ad00a65e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af37c97cb9eeb0f28106f4383d12e60cfa292de43461e1c145620894b0963711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6a21e089a434af33b0fd5ac99d60b8c43c1629899a0fa5b800d61536b1a28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d21ae9fd306386504abf4f3a98cb19822ff72b6274ceac868ab387103e6b4958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593ac91636b684dc5c2f5c5e098ef8244dc52006ee936103271da2de7e9abee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1349cd5e26b4fe0a3fa72055e8858a845fb239df2d95499567640ecdfbd2e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3ff0a0b9d0c03e6acfec84fb1283a10d85bbfd4cab3c9af461a55f4b533a468\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf0413dd6c066e33fcda10dfe76555b8e1dee3e1cb2ce6d9c9e47fee601912ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.110400 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98fe0ea-1856-4645-8a0a-54e481990853\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a2ae9b2f1a08d350d318983a851129061d7386870a22dbc5b9d37696e12acf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0df03f0dc0efd96423db060ae12de8e43a590a35fbdc2512a971c42be53ed0f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20bdf77ba963ea3ce8a1a0c417b3b9a65fb55de691ff3c692c0f665db4537aac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.133005 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4cae306-d133-4f6b-b5f7-c86a8cf6fd11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74c09f51d96bb17f345247d93279c9b935b36a05ac529f416ddb5872263a90a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ba85add62ce511a6181adc30aa2a56135cd23849ed8ed27929c6173c3653a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2fe7464b29f11f0328a9ca9851e8035b07f3645fafb950459359ae12569e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024ff0da5479602aa86ae801887d0268304adfb9dd0221ffb02a082a4450953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ec5d
ab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ec5dab5b660aafc01a27a3495804a815317b0969db5c5f57767d476f757d3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c165affcd980f7fae3aef06dbbd8c6170089d3d882db91b889bb853ee8eda459\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0316aefe46391f187a749dfdab2903870c22ad6e7ed04a2f816d0016df610699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2h4k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wt7n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.138366 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.138416 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.138433 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.138503 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.138531 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.150619 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c2klk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a911963-1d06-47d0-8f70-d81d5bd47496\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nfmb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c2klk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc 
kubenswrapper[4914]: I0130 21:15:48.168750 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c287caa9-36a4-4d1f-9799-0fda91a8c8d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6290627e18442c113a406f65209ebdfcba1bb33e7c5a68b91627ce221f637ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f73b0ec9a8fa4b6117ce28f4e470b98d30119c8d2e49dbe9b4db7c20ebd631bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flg8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z2dvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.188686 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d5ab23e727ba0c6b1c5a6f0bb6e9c6381051589ec2f362c912e583d58cac73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.204037 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zxtk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e82ab6e-8068-438b-9caa-f3d7028cbb5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cee68c2b31e678c79f792d7a73707454b9068da5f714e1e39b65537bb18c4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v84mz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zxtk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.220236 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fdcb1ee-3a89-4e05-b691-ab2c9540e07a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f74a072a31d862caee808486ce40398646a7edd7a44143f51258af9e3619be\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801e1d86b518b18af908f48c442135881fee4749371a2d50f5232a4eb9a4eb62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.238165 4914 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:14:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0130 21:14:41.293135 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 21:14:41.294660 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3559384360/tls.crt::/tmp/serving-cert-3559384360/tls.key\\\\\\\"\\\\nI0130 21:14:47.341501 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0130 21:14:47.345517 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0130 21:14:47.345548 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0130 21:14:47.345583 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0130 21:14:47.345591 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0130 21:14:47.363703 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0130 21:14:47.363757 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363765 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0130 21:14:47.363773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0130 
21:14:47.363778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0130 21:14:47.363809 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nI0130 21:14:47.363795 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0130 21:14:47.363814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0130 21:14:47.369467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:14:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:14:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.241145 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.241384 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.241526 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.241661 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 
21:15:48.241859 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.345083 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.345204 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.345229 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.345294 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.345314 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.449234 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.449353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.449372 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.449397 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.449416 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.552885 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.552941 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.552965 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.553032 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.553056 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.656465 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.656536 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.656561 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.656591 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.656616 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.759980 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.760033 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.760053 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.760078 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.760097 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.804051 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 09:00:38.28257286 +0000 UTC Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.817385 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.817532 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.853661 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.853740 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.853758 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.853781 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.853793 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.873580 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.879281 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.879464 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.879570 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.879750 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.879861 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.901494 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.906507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.906544 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.906556 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.906575 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.906588 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.925886 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.931701 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.931863 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.931887 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.931910 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.931928 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.953894 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.959377 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.959484 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.959565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.959674 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.959750 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.983814 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f33c804c-e82d-481d-b93f-218591a98a10\\\",\\\"systemUUID\\\":\\\"04fc677e-7e41-47a1-8a02-3259b15b63c4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:15:48Z is after 2025-08-24T17:21:41Z" Jan 30 21:15:48 crc kubenswrapper[4914]: E0130 21:15:48.984059 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.986180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.986237 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.986251 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.986269 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:48 crc kubenswrapper[4914]: I0130 21:15:48.986283 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:48Z","lastTransitionTime":"2026-01-30T21:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.089195 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.089265 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.089288 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.089320 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.089371 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.193606 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.193666 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.193684 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.193762 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.193783 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.296231 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.296582 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.296780 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.296944 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.297085 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.399037 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.399113 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.399133 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.399155 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.399169 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.501565 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.501612 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.501625 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.501643 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.501656 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.604856 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.604920 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.604944 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.604975 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.604997 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.708234 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.708268 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.708279 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.708296 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.708307 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.804174 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:29:16.162153546 +0000 UTC Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.811731 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.811784 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.811806 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.811838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.811859 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.817355 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.817525 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:49 crc kubenswrapper[4914]: E0130 21:15:49.817677 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:49 crc kubenswrapper[4914]: E0130 21:15:49.817516 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.817375 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:49 crc kubenswrapper[4914]: E0130 21:15:49.817843 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.915256 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.915295 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.915307 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.915321 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:49 crc kubenswrapper[4914]: I0130 21:15:49.915331 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:49Z","lastTransitionTime":"2026-01-30T21:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.017813 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.017877 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.017895 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.017922 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.017940 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.121480 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.121541 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.121558 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.121583 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.121602 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.224357 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.224686 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.224911 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.225095 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.225240 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.328491 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.329098 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.329245 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.329403 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.329546 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.432416 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.432465 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.432481 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.432505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.432524 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.535629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.535740 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.535766 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.535794 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.535812 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.639258 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.639617 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.639800 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.639973 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.640154 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.742835 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.742892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.742909 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.742933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.742952 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.804923 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 10:20:39.379194054 +0000 UTC Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.817384 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:50 crc kubenswrapper[4914]: E0130 21:15:50.817563 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.845769 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.845924 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.845945 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.845969 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.845986 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.949189 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.949286 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.949305 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.949366 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:50 crc kubenswrapper[4914]: I0130 21:15:50.949387 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:50Z","lastTransitionTime":"2026-01-30T21:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.053093 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.053150 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.053166 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.053188 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.053206 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.157181 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.157276 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.157299 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.157331 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.157351 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.261130 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.261225 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.261243 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.261270 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.261287 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.363489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.363566 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.363583 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.363609 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.363626 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.466489 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.466610 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.466629 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.466688 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.466743 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.568285 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.568622 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:16:55.568550752 +0000 UTC m=+149.007187553 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.568761 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.568829 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.568870 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.568930 4914 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.568997 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.568980862 +0000 UTC m=+149.007617663 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.569038 4914 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.569072 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.569100 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.569119 4914 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.569146 4914 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.569118185 +0000 UTC m=+149.007754986 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.569190 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.569165476 +0000 UTC m=+149.007802277 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.569824 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.569872 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.569892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.569918 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.569935 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.669521 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.669829 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.669857 4914 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.669876 4914 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.669946 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.669923248 +0000 UTC m=+149.108560049 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.672468 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.672682 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.672899 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.673044 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.673187 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.770961 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.771274 4914 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.771416 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs podName:8a911963-1d06-47d0-8f70-d81d5bd47496 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.771385167 +0000 UTC m=+149.210021958 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs") pod "network-metrics-daemon-c2klk" (UID: "8a911963-1d06-47d0-8f70-d81d5bd47496") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.776252 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.776314 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.776332 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.776355 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.776372 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.805932 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 08:54:23.04384798 +0000 UTC Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.817502 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.817858 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.818010 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.818184 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.818365 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:51 crc kubenswrapper[4914]: E0130 21:15:51.818642 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.879635 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.879892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.879906 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.879923 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.879935 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.983363 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.983596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.983778 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.983949 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:51 crc kubenswrapper[4914]: I0130 21:15:51.984115 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:51Z","lastTransitionTime":"2026-01-30T21:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.086842 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.086902 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.086919 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.086945 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.086964 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.189151 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.189838 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.190027 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.190216 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.190359 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.294005 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.294113 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.294135 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.294163 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.294189 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.397303 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.397353 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.397365 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.397383 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.397396 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.499585 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.499647 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.499666 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.499690 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.499739 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.603322 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.603404 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.603428 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.603457 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.603484 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.707103 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.707159 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.707176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.707199 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.707216 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.807404 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 07:32:05.743548992 +0000 UTC Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.809630 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.809702 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.809816 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.809849 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.809875 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.817349 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:52 crc kubenswrapper[4914]: E0130 21:15:52.817729 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.912543 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.912605 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.912627 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.912657 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:52 crc kubenswrapper[4914]: I0130 21:15:52.912678 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:52Z","lastTransitionTime":"2026-01-30T21:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.015365 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.015430 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.015447 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.015471 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.015492 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.118664 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.118775 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.118798 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.118829 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.118851 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.221659 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.221745 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.221768 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.221795 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.221815 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.324755 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.324832 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.324858 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.324884 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.324900 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.427655 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.427757 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.427776 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.427802 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.427823 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.531608 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.531680 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.531760 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.531795 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.531821 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.635373 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.635466 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.635491 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.635517 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.635538 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.738553 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.738582 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.738591 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.738636 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.738647 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.808402 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 14:06:19.976375954 +0000 UTC Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.817830 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.817846 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:53 crc kubenswrapper[4914]: E0130 21:15:53.818003 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.818064 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:53 crc kubenswrapper[4914]: E0130 21:15:53.818136 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:53 crc kubenswrapper[4914]: E0130 21:15:53.818221 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.841555 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.841619 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.841637 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.841662 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.841683 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.944921 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.944990 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.945009 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.945035 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:53 crc kubenswrapper[4914]: I0130 21:15:53.945054 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:53Z","lastTransitionTime":"2026-01-30T21:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.048075 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.048131 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.048147 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.048170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.048188 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.150603 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.150675 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.150694 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.150755 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.150775 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.254382 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.254447 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.254470 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.254500 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.254524 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.357891 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.357953 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.357971 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.357994 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.358013 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.461386 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.461444 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.461462 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.461485 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.461504 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.564507 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.564568 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.564586 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.564609 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.564627 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.668590 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.668674 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.668693 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.668762 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.668781 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.771891 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.771937 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.771947 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.771965 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.771978 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.809355 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 05:44:33.758007271 +0000 UTC Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.817694 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:54 crc kubenswrapper[4914]: E0130 21:15:54.817903 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.875484 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.875537 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.875554 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.875575 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.875594 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.979200 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.979265 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.979283 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.979310 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:54 crc kubenswrapper[4914]: I0130 21:15:54.979332 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:54Z","lastTransitionTime":"2026-01-30T21:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.082134 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.082198 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.082215 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.082241 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.082259 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.185379 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.185442 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.185459 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.185486 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.185509 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.289122 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.289183 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.289200 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.289227 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.289245 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.391836 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.391905 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.391928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.391958 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.391977 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.494928 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.494992 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.495010 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.495038 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.495054 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.598303 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.598403 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.598421 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.598443 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.598460 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.702930 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.702987 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.703027 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.703057 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.703078 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.805419 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.805479 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.805498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.805524 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.805542 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.810144 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 22:41:40.512637423 +0000 UTC Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.817778 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.817850 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.817849 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:55 crc kubenswrapper[4914]: E0130 21:15:55.817981 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:55 crc kubenswrapper[4914]: E0130 21:15:55.818105 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:55 crc kubenswrapper[4914]: E0130 21:15:55.818222 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.907825 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.907883 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.907899 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.907921 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:55 crc kubenswrapper[4914]: I0130 21:15:55.907938 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:55Z","lastTransitionTime":"2026-01-30T21:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.010959 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.011023 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.011039 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.011066 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.011083 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.113821 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.113876 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.113892 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.113915 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.113936 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.216858 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.216933 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.216953 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.216979 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.217003 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.319585 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.319649 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.319667 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.319692 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.319743 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.422549 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.422608 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.422627 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.422653 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.422671 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.525645 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.525846 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.525879 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.525918 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.525943 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.628875 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.628937 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.628954 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.628984 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.629003 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.738858 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.739481 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.739889 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.740108 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.740306 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.811145 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 17:43:11.671167482 +0000 UTC Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.818127 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:56 crc kubenswrapper[4914]: E0130 21:15:56.818488 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.844376 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.844427 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.844479 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.844503 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.844549 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.947327 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.947423 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.947441 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.947498 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:56 crc kubenswrapper[4914]: I0130 21:15:56.947517 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:56Z","lastTransitionTime":"2026-01-30T21:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.051009 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.051157 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.051185 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.051252 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.051278 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.154170 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.154208 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.154216 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.154231 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.154240 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.257596 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.257687 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.257761 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.257789 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.257808 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.360760 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.360807 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.360824 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.360843 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.360861 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.462880 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.462951 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.462962 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.462979 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.462992 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.566509 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.566601 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.566620 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.566646 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.566693 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.669180 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.669231 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.669242 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.669261 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.669273 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.771979 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.772040 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.772057 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.772082 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.772100 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.812336 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 22:09:59.797967518 +0000 UTC Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.817861 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.817862 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:57 crc kubenswrapper[4914]: E0130 21:15:57.818228 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:57 crc kubenswrapper[4914]: E0130 21:15:57.818433 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.817990 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:57 crc kubenswrapper[4914]: E0130 21:15:57.818935 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.867284 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.867265257 podStartE2EDuration="16.867265257s" podCreationTimestamp="2026-01-30 21:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:57.842284976 +0000 UTC m=+91.280921737" watchObservedRunningTime="2026-01-30 21:15:57.867265257 +0000 UTC m=+91.305902018" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.875569 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.875599 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.875609 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.875626 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.875638 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.891286 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.891263575 podStartE2EDuration="1m10.891263575s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:57.867911792 +0000 UTC m=+91.306548563" watchObservedRunningTime="2026-01-30 21:15:57.891263575 +0000 UTC m=+91.329900376" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.941362 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podStartSLOduration=70.941341109 podStartE2EDuration="1m10.941341109s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:57.923733436 +0000 UTC m=+91.362370197" watchObservedRunningTime="2026-01-30 21:15:57.941341109 +0000 UTC m=+91.379977900" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.941805 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7xn26" podStartSLOduration=69.941797509 podStartE2EDuration="1m9.941797509s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:57.941296808 +0000 UTC m=+91.379933609" watchObservedRunningTime="2026-01-30 21:15:57.941797509 +0000 UTC m=+91.380434310" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.977654 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 
21:15:57.977747 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.977767 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.977794 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:57 crc kubenswrapper[4914]: I0130 21:15:57.977813 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:57Z","lastTransitionTime":"2026-01-30T21:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.008151 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.008120744 podStartE2EDuration="1m10.008120744s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.00661308 +0000 UTC m=+91.445249871" watchObservedRunningTime="2026-01-30 21:15:58.008120744 +0000 UTC m=+91.446757545" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.026088 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.026057834 podStartE2EDuration="1m8.026057834s" podCreationTimestamp="2026-01-30 21:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.02543086 
+0000 UTC m=+91.464067661" watchObservedRunningTime="2026-01-30 21:15:58.026057834 +0000 UTC m=+91.464694635" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.066941 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.066909688 podStartE2EDuration="36.066909688s" podCreationTimestamp="2026-01-30 21:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.043884852 +0000 UTC m=+91.482521613" watchObservedRunningTime="2026-01-30 21:15:58.066909688 +0000 UTC m=+91.505546489" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.080367 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.080415 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.080434 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.080456 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.080473 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.157198 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wvbd7" podStartSLOduration=71.15717607 podStartE2EDuration="1m11.15717607s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.132312472 +0000 UTC m=+91.570949263" watchObservedRunningTime="2026-01-30 21:15:58.15717607 +0000 UTC m=+91.595812841" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.157344 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wt7n5" podStartSLOduration=71.157338474 podStartE2EDuration="1m11.157338474s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.155969163 +0000 UTC m=+91.594605964" watchObservedRunningTime="2026-01-30 21:15:58.157338474 +0000 UTC m=+91.595975245" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.184186 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.184236 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.184248 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.184268 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.184281 4914 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.230894 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z2dvv" podStartSLOduration=70.230868014 podStartE2EDuration="1m10.230868014s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.229547124 +0000 UTC m=+91.668183925" watchObservedRunningTime="2026-01-30 21:15:58.230868014 +0000 UTC m=+91.669504815" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.231565 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zxtk5" podStartSLOduration=71.23155646 podStartE2EDuration="1m11.23155646s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:15:58.208539204 +0000 UTC m=+91.647176005" watchObservedRunningTime="2026-01-30 21:15:58.23155646 +0000 UTC m=+91.670193261" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.286465 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.286522 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.286539 4914 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.286566 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.286584 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.389092 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.389158 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.389176 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.389202 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.389223 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.492821 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.492865 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.492873 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.492888 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.492900 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.595505 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.595567 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.595584 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.595609 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.595630 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.698123 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.698186 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.698198 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.698218 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.698230 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.801579 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.801642 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.801656 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.801675 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.801689 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.812812 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 07:23:01.175549636 +0000 UTC Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.817251 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:15:58 crc kubenswrapper[4914]: E0130 21:15:58.817420 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.905175 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.905249 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.905273 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.905305 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:58 crc kubenswrapper[4914]: I0130 21:15:58.905332 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:58Z","lastTransitionTime":"2026-01-30T21:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.009179 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.009245 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.009266 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.009293 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.009310 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:59Z","lastTransitionTime":"2026-01-30T21:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.112745 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.112812 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.112829 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.112854 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.112873 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:59Z","lastTransitionTime":"2026-01-30T21:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.215944 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.216000 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.216017 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.216041 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.216059 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:59Z","lastTransitionTime":"2026-01-30T21:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.221104 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.221161 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.221179 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.221204 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.221224 4914 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:15:59Z","lastTransitionTime":"2026-01-30T21:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.282120 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq"] Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.282837 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.287673 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.291246 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.291384 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.292336 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.370271 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.370393 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.370428 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.370487 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.370583 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471229 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471339 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471379 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471444 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471447 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471553 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.471653 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.472842 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.486390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.501262 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-btfdq\" (UID: \"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.608841 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" Jan 30 21:15:59 crc kubenswrapper[4914]: W0130 21:15:59.629280 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd277ac8d_11a4_42d2_94b8_e8d8ff21bdd2.slice/crio-ba71fd570b3cbcfb22dfdaf905eb2ef9448aa86990aadf3a76a548da3eaa3c7f WatchSource:0}: Error finding container ba71fd570b3cbcfb22dfdaf905eb2ef9448aa86990aadf3a76a548da3eaa3c7f: Status 404 returned error can't find the container with id ba71fd570b3cbcfb22dfdaf905eb2ef9448aa86990aadf3a76a548da3eaa3c7f Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.813598 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:04:50.310916631 +0000 UTC Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.814031 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.817196 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.817265 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.817354 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:15:59 crc kubenswrapper[4914]: E0130 21:15:59.817567 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:15:59 crc kubenswrapper[4914]: E0130 21:15:59.817684 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:15:59 crc kubenswrapper[4914]: E0130 21:15:59.818024 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.820183 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0" Jan 30 21:15:59 crc kubenswrapper[4914]: E0130 21:15:59.820494 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:15:59 crc kubenswrapper[4914]: I0130 21:15:59.825325 4914 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 21:16:00 crc kubenswrapper[4914]: I0130 21:16:00.389580 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" event={"ID":"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2","Type":"ContainerStarted","Data":"80eef2377e61a8bdc541aed09ed8529d39a0c1dd9e76915a4f1d2841b8f20ac3"} Jan 30 21:16:00 crc kubenswrapper[4914]: I0130 21:16:00.389653 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" event={"ID":"d277ac8d-11a4-42d2-94b8-e8d8ff21bdd2","Type":"ContainerStarted","Data":"ba71fd570b3cbcfb22dfdaf905eb2ef9448aa86990aadf3a76a548da3eaa3c7f"} Jan 30 21:16:00 crc kubenswrapper[4914]: I0130 21:16:00.817533 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:00 crc kubenswrapper[4914]: E0130 21:16:00.817757 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:01 crc kubenswrapper[4914]: I0130 21:16:01.817805 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:01 crc kubenswrapper[4914]: I0130 21:16:01.817839 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:01 crc kubenswrapper[4914]: I0130 21:16:01.817847 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:01 crc kubenswrapper[4914]: E0130 21:16:01.817979 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:01 crc kubenswrapper[4914]: E0130 21:16:01.818177 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:01 crc kubenswrapper[4914]: E0130 21:16:01.818342 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:02 crc kubenswrapper[4914]: I0130 21:16:02.817353 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:02 crc kubenswrapper[4914]: E0130 21:16:02.817670 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:03 crc kubenswrapper[4914]: I0130 21:16:03.817752 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:03 crc kubenswrapper[4914]: I0130 21:16:03.817821 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:03 crc kubenswrapper[4914]: I0130 21:16:03.817885 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:03 crc kubenswrapper[4914]: E0130 21:16:03.817943 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:03 crc kubenswrapper[4914]: E0130 21:16:03.818236 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:03 crc kubenswrapper[4914]: E0130 21:16:03.818126 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:04 crc kubenswrapper[4914]: I0130 21:16:04.817859 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:04 crc kubenswrapper[4914]: E0130 21:16:04.817970 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:05 crc kubenswrapper[4914]: I0130 21:16:05.817534 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:05 crc kubenswrapper[4914]: E0130 21:16:05.817754 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:05 crc kubenswrapper[4914]: I0130 21:16:05.817799 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:05 crc kubenswrapper[4914]: E0130 21:16:05.818066 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:05 crc kubenswrapper[4914]: I0130 21:16:05.818155 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:05 crc kubenswrapper[4914]: E0130 21:16:05.818310 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:06 crc kubenswrapper[4914]: I0130 21:16:06.817756 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:06 crc kubenswrapper[4914]: E0130 21:16:06.818243 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:07 crc kubenswrapper[4914]: I0130 21:16:07.817175 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:07 crc kubenswrapper[4914]: I0130 21:16:07.817197 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:07 crc kubenswrapper[4914]: I0130 21:16:07.817379 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:07 crc kubenswrapper[4914]: E0130 21:16:07.819315 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:07 crc kubenswrapper[4914]: E0130 21:16:07.820180 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:07 crc kubenswrapper[4914]: E0130 21:16:07.820356 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:08 crc kubenswrapper[4914]: I0130 21:16:08.817311 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:08 crc kubenswrapper[4914]: E0130 21:16:08.817530 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:09 crc kubenswrapper[4914]: I0130 21:16:09.818050 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:09 crc kubenswrapper[4914]: I0130 21:16:09.818115 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:09 crc kubenswrapper[4914]: I0130 21:16:09.818136 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:09 crc kubenswrapper[4914]: E0130 21:16:09.818268 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:09 crc kubenswrapper[4914]: E0130 21:16:09.818486 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:09 crc kubenswrapper[4914]: E0130 21:16:09.818795 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:10 crc kubenswrapper[4914]: I0130 21:16:10.817615 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:10 crc kubenswrapper[4914]: E0130 21:16:10.818575 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:11 crc kubenswrapper[4914]: I0130 21:16:11.817264 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:11 crc kubenswrapper[4914]: I0130 21:16:11.817331 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:11 crc kubenswrapper[4914]: E0130 21:16:11.817640 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:11 crc kubenswrapper[4914]: I0130 21:16:11.817880 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:11 crc kubenswrapper[4914]: E0130 21:16:11.818435 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:11 crc kubenswrapper[4914]: E0130 21:16:11.818632 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:11 crc kubenswrapper[4914]: I0130 21:16:11.819326 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0" Jan 30 21:16:11 crc kubenswrapper[4914]: E0130 21:16:11.819624 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:16:12 crc kubenswrapper[4914]: I0130 21:16:12.817854 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:12 crc kubenswrapper[4914]: E0130 21:16:12.818034 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:13 crc kubenswrapper[4914]: I0130 21:16:13.817570 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:13 crc kubenswrapper[4914]: I0130 21:16:13.817633 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:13 crc kubenswrapper[4914]: E0130 21:16:13.817834 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:13 crc kubenswrapper[4914]: I0130 21:16:13.817950 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:13 crc kubenswrapper[4914]: E0130 21:16:13.818121 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:13 crc kubenswrapper[4914]: E0130 21:16:13.818274 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:14 crc kubenswrapper[4914]: I0130 21:16:14.817813 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:14 crc kubenswrapper[4914]: E0130 21:16:14.817986 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:15 crc kubenswrapper[4914]: I0130 21:16:15.817222 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:15 crc kubenswrapper[4914]: I0130 21:16:15.817323 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:15 crc kubenswrapper[4914]: E0130 21:16:15.817435 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:15 crc kubenswrapper[4914]: I0130 21:16:15.817465 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:15 crc kubenswrapper[4914]: E0130 21:16:15.817626 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:15 crc kubenswrapper[4914]: E0130 21:16:15.817797 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:16 crc kubenswrapper[4914]: I0130 21:16:16.817676 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:16 crc kubenswrapper[4914]: E0130 21:16:16.817864 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:17 crc kubenswrapper[4914]: I0130 21:16:17.817797 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:17 crc kubenswrapper[4914]: I0130 21:16:17.818907 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:17 crc kubenswrapper[4914]: E0130 21:16:17.819086 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:17 crc kubenswrapper[4914]: I0130 21:16:17.819115 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:17 crc kubenswrapper[4914]: E0130 21:16:17.819257 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:17 crc kubenswrapper[4914]: E0130 21:16:17.819311 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:18 crc kubenswrapper[4914]: I0130 21:16:18.817882 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:18 crc kubenswrapper[4914]: E0130 21:16:18.818948 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:19 crc kubenswrapper[4914]: I0130 21:16:19.817815 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:19 crc kubenswrapper[4914]: I0130 21:16:19.817815 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:19 crc kubenswrapper[4914]: I0130 21:16:19.819218 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:19 crc kubenswrapper[4914]: E0130 21:16:19.819361 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:19 crc kubenswrapper[4914]: E0130 21:16:19.819526 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:19 crc kubenswrapper[4914]: E0130 21:16:19.819594 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:20 crc kubenswrapper[4914]: I0130 21:16:20.817011 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:20 crc kubenswrapper[4914]: E0130 21:16:20.817153 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:21 crc kubenswrapper[4914]: I0130 21:16:21.817216 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:21 crc kubenswrapper[4914]: E0130 21:16:21.817788 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:21 crc kubenswrapper[4914]: I0130 21:16:21.818429 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:21 crc kubenswrapper[4914]: E0130 21:16:21.818551 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:21 crc kubenswrapper[4914]: I0130 21:16:21.822044 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:21 crc kubenswrapper[4914]: E0130 21:16:21.822351 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.468450 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/1.log" Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.469263 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/0.log" Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.469490 4914 generic.go:334] "Generic (PLEG): container finished" podID="c1067fc5-9bff-4a81-982f-b2cca1c432d0" containerID="556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d" exitCode=1 Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.469618 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerDied","Data":"556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d"} Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.469695 4914 scope.go:117] "RemoveContainer" containerID="ea62c18f7a63c1c1f20abc73e0899a41820a4d86d2ecf998567f4a54d9acff3b" Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.470309 4914 scope.go:117] "RemoveContainer" containerID="556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d" Jan 30 21:16:22 crc kubenswrapper[4914]: E0130 21:16:22.470595 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wvbd7_openshift-multus(c1067fc5-9bff-4a81-982f-b2cca1c432d0)\"" pod="openshift-multus/multus-wvbd7" podUID="c1067fc5-9bff-4a81-982f-b2cca1c432d0" Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.496808 4914 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-btfdq" podStartSLOduration=94.496764694 podStartE2EDuration="1m34.496764694s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:00.408799021 +0000 UTC m=+93.847435812" watchObservedRunningTime="2026-01-30 21:16:22.496764694 +0000 UTC m=+115.935401465" Jan 30 21:16:22 crc kubenswrapper[4914]: I0130 21:16:22.817023 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:22 crc kubenswrapper[4914]: E0130 21:16:22.817205 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:23 crc kubenswrapper[4914]: I0130 21:16:23.476639 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/1.log" Jan 30 21:16:23 crc kubenswrapper[4914]: I0130 21:16:23.817808 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:23 crc kubenswrapper[4914]: I0130 21:16:23.817892 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:23 crc kubenswrapper[4914]: E0130 21:16:23.817998 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:23 crc kubenswrapper[4914]: I0130 21:16:23.818151 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:23 crc kubenswrapper[4914]: E0130 21:16:23.818207 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:23 crc kubenswrapper[4914]: E0130 21:16:23.818298 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:24 crc kubenswrapper[4914]: I0130 21:16:24.817211 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:24 crc kubenswrapper[4914]: E0130 21:16:24.817399 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:24 crc kubenswrapper[4914]: I0130 21:16:24.818441 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0" Jan 30 21:16:24 crc kubenswrapper[4914]: E0130 21:16:24.818791 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-hchqc_openshift-ovn-kubernetes(6a32fa1f-f3a9-4e60-b665-51138c3ce768)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" Jan 30 21:16:25 crc kubenswrapper[4914]: I0130 21:16:25.817684 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:25 crc kubenswrapper[4914]: E0130 21:16:25.817922 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:25 crc kubenswrapper[4914]: I0130 21:16:25.817729 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:25 crc kubenswrapper[4914]: E0130 21:16:25.818079 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:25 crc kubenswrapper[4914]: I0130 21:16:25.817687 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:25 crc kubenswrapper[4914]: E0130 21:16:25.818171 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:26 crc kubenswrapper[4914]: I0130 21:16:26.817689 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:26 crc kubenswrapper[4914]: E0130 21:16:26.817912 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:27 crc kubenswrapper[4914]: E0130 21:16:27.807821 4914 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 30 21:16:27 crc kubenswrapper[4914]: I0130 21:16:27.817564 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:27 crc kubenswrapper[4914]: I0130 21:16:27.817955 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:27 crc kubenswrapper[4914]: E0130 21:16:27.820036 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:27 crc kubenswrapper[4914]: I0130 21:16:27.820111 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:27 crc kubenswrapper[4914]: E0130 21:16:27.821037 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:27 crc kubenswrapper[4914]: E0130 21:16:27.821192 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:27 crc kubenswrapper[4914]: E0130 21:16:27.934550 4914 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:16:28 crc kubenswrapper[4914]: I0130 21:16:28.817964 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:28 crc kubenswrapper[4914]: E0130 21:16:28.818174 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:29 crc kubenswrapper[4914]: I0130 21:16:29.817946 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:29 crc kubenswrapper[4914]: I0130 21:16:29.818015 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:29 crc kubenswrapper[4914]: I0130 21:16:29.818015 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:29 crc kubenswrapper[4914]: E0130 21:16:29.818140 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:29 crc kubenswrapper[4914]: E0130 21:16:29.818282 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:29 crc kubenswrapper[4914]: E0130 21:16:29.818365 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:30 crc kubenswrapper[4914]: I0130 21:16:30.817306 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:30 crc kubenswrapper[4914]: E0130 21:16:30.817504 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:31 crc kubenswrapper[4914]: I0130 21:16:31.817873 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:31 crc kubenswrapper[4914]: I0130 21:16:31.818229 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:31 crc kubenswrapper[4914]: I0130 21:16:31.818307 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:31 crc kubenswrapper[4914]: E0130 21:16:31.818398 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:31 crc kubenswrapper[4914]: E0130 21:16:31.818508 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:31 crc kubenswrapper[4914]: E0130 21:16:31.818642 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:32 crc kubenswrapper[4914]: I0130 21:16:32.818024 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:32 crc kubenswrapper[4914]: E0130 21:16:32.818259 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:32 crc kubenswrapper[4914]: E0130 21:16:32.937049 4914 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:16:33 crc kubenswrapper[4914]: I0130 21:16:33.817546 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:33 crc kubenswrapper[4914]: I0130 21:16:33.817638 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:33 crc kubenswrapper[4914]: I0130 21:16:33.817663 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:33 crc kubenswrapper[4914]: E0130 21:16:33.817840 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:33 crc kubenswrapper[4914]: E0130 21:16:33.818029 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:33 crc kubenswrapper[4914]: E0130 21:16:33.818285 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:33 crc kubenswrapper[4914]: I0130 21:16:33.818490 4914 scope.go:117] "RemoveContainer" containerID="556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d" Jan 30 21:16:34 crc kubenswrapper[4914]: I0130 21:16:34.526647 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/1.log" Jan 30 21:16:34 crc kubenswrapper[4914]: I0130 21:16:34.527145 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerStarted","Data":"c2b89d677b10a0c9096fdbb15c317ca43c6c9d680a668ca53e06449829acfd01"} Jan 30 21:16:34 crc kubenswrapper[4914]: I0130 21:16:34.817627 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:34 crc kubenswrapper[4914]: E0130 21:16:34.817877 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:35 crc kubenswrapper[4914]: I0130 21:16:35.817602 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:35 crc kubenswrapper[4914]: E0130 21:16:35.818113 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:35 crc kubenswrapper[4914]: I0130 21:16:35.817694 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:35 crc kubenswrapper[4914]: I0130 21:16:35.817896 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:35 crc kubenswrapper[4914]: E0130 21:16:35.818333 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:35 crc kubenswrapper[4914]: E0130 21:16:35.818539 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:36 crc kubenswrapper[4914]: I0130 21:16:36.817258 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:36 crc kubenswrapper[4914]: E0130 21:16:36.817437 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:37 crc kubenswrapper[4914]: I0130 21:16:37.817684 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:37 crc kubenswrapper[4914]: I0130 21:16:37.817830 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:37 crc kubenswrapper[4914]: E0130 21:16:37.819514 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:37 crc kubenswrapper[4914]: I0130 21:16:37.819575 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:37 crc kubenswrapper[4914]: E0130 21:16:37.819877 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:37 crc kubenswrapper[4914]: E0130 21:16:37.820555 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:37 crc kubenswrapper[4914]: I0130 21:16:37.821078 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0" Jan 30 21:16:37 crc kubenswrapper[4914]: E0130 21:16:37.938386 4914 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.547197 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/3.log" Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.549634 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerStarted","Data":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.550216 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.817771 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:38 crc kubenswrapper[4914]: E0130 21:16:38.817941 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.876277 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podStartSLOduration=111.876258071 podStartE2EDuration="1m51.876258071s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:38.585001583 +0000 UTC m=+132.023638364" watchObservedRunningTime="2026-01-30 21:16:38.876258071 +0000 UTC m=+132.314894832" Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.876968 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c2klk"] Jan 30 21:16:38 crc kubenswrapper[4914]: I0130 21:16:38.877082 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:38 crc kubenswrapper[4914]: E0130 21:16:38.877185 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496" Jan 30 21:16:39 crc kubenswrapper[4914]: I0130 21:16:39.820551 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:39 crc kubenswrapper[4914]: E0130 21:16:39.820762 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:39 crc kubenswrapper[4914]: I0130 21:16:39.821014 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:39 crc kubenswrapper[4914]: E0130 21:16:39.821117 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:40 crc kubenswrapper[4914]: I0130 21:16:40.817084 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:40 crc kubenswrapper[4914]: I0130 21:16:40.817116 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:40 crc kubenswrapper[4914]: E0130 21:16:40.817282 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496"
Jan 30 21:16:40 crc kubenswrapper[4914]: E0130 21:16:40.817405 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 21:16:41 crc kubenswrapper[4914]: I0130 21:16:41.817346 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 21:16:41 crc kubenswrapper[4914]: I0130 21:16:41.817433 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 21:16:41 crc kubenswrapper[4914]: E0130 21:16:41.817880 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 21:16:41 crc kubenswrapper[4914]: E0130 21:16:41.817995 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 21:16:42 crc kubenswrapper[4914]: I0130 21:16:42.818005 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk"
Jan 30 21:16:42 crc kubenswrapper[4914]: I0130 21:16:42.818065 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 21:16:42 crc kubenswrapper[4914]: E0130 21:16:42.818214 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c2klk" podUID="8a911963-1d06-47d0-8f70-d81d5bd47496"
Jan 30 21:16:42 crc kubenswrapper[4914]: E0130 21:16:42.818438 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 21:16:43 crc kubenswrapper[4914]: I0130 21:16:43.817634 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 21:16:43 crc kubenswrapper[4914]: I0130 21:16:43.817745 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 21:16:43 crc kubenswrapper[4914]: I0130 21:16:43.819892 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 21:16:43 crc kubenswrapper[4914]: I0130 21:16:43.820898 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 21:16:43 crc kubenswrapper[4914]: I0130 21:16:43.820920 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 30 21:16:43 crc kubenswrapper[4914]: I0130 21:16:43.822369 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 30 21:16:44 crc kubenswrapper[4914]: I0130 21:16:44.817842 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk"
Jan 30 21:16:44 crc kubenswrapper[4914]: I0130 21:16:44.818378 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 21:16:44 crc kubenswrapper[4914]: I0130 21:16:44.820316 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 30 21:16:44 crc kubenswrapper[4914]: I0130 21:16:44.821950 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.413381 4914 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.463560 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pscbd"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.464247 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.467571 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.468440 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.470854 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.471545 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.477785 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.478345 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.478766 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-85rbp"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.479158 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.480995 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.481308 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.482700 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.482910 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.483347 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.493312 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.496015 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2cd62"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.497047 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.497312 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.497550 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.499629 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.500088 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.500325 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.500445 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.500606 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.501230 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.501372 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.503826 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: W0130 21:16:50.508833 4914 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Jan 30 21:16:50 crc kubenswrapper[4914]: E0130 21:16:50.508913 4914 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 21:16:50 crc kubenswrapper[4914]: W0130 21:16:50.509997 4914 reflector.go:561] object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Jan 30 21:16:50 crc kubenswrapper[4914]: W0130 21:16:50.510102 4914 reflector.go:561] object-"openshift-cluster-samples-operator"/"samples-operator-tls": failed to list *v1.Secret: secrets "samples-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Jan 30 21:16:50 crc kubenswrapper[4914]: E0130 21:16:50.510135 4914 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"samples-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.514086 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: W0130 21:16:50.514330 4914 reflector.go:561] object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w": failed to list *v1.Secret: secrets "cluster-samples-operator-dockercfg-xpp9w" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Jan 30 21:16:50 crc kubenswrapper[4914]: E0130 21:16:50.514367 4914 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xpp9w\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-samples-operator-dockercfg-xpp9w\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 21:16:50 crc kubenswrapper[4914]: W0130 21:16:50.514466 4914 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Jan 30 21:16:50 crc kubenswrapper[4914]: E0130 21:16:50.514491 4914 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 21:16:50 crc kubenswrapper[4914]: W0130 21:16:50.514644 4914 reflector.go:561] object-"openshift-cluster-samples-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Jan 30 21:16:50 crc kubenswrapper[4914]: E0130 21:16:50.514672 4914 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.514747 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.514955 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.520991 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.521182 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.522857 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.523043 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.523193 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.523391 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.523500 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.523595 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.524024 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.524122 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.527472 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: E0130 21:16:50.527568 4914 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.527605 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.527832 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.528052 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.529338 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.536917 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.537205 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.537359 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.537507 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540342 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540440 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540525 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-audit-policies\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540606 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-images\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540678 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-client-ca\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540795 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/247e526c-e643-4ffb-a6f2-b4678132b8a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.540873 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541014 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ttp8\" (UniqueName: \"kubernetes.io/projected/8b850795-7fca-417d-9e31-c319e45e2594-kube-api-access-8ttp8\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541085 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-config\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541153 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz4sj\" (UniqueName: \"kubernetes.io/projected/3efefb06-dccd-4432-8a91-9ac951803c21-kube-api-access-jz4sj\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541229 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77g9c\" (UniqueName: \"kubernetes.io/projected/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-kube-api-access-77g9c\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541306 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541374 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b850795-7fca-417d-9e31-c319e45e2594-audit-dir\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541469 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jvj\" (UniqueName: \"kubernetes.io/projected/247e526c-e643-4ffb-a6f2-b4678132b8a7-kube-api-access-r7jvj\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541541 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3efefb06-dccd-4432-8a91-9ac951803c21-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.541613 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zftk\" (UniqueName: \"kubernetes.io/projected/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-kube-api-access-4zftk\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.542357 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-serving-cert\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.542764 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-config\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.542876 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-etcd-client\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.542980 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543099 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543214 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dff3a310-c986-4724-8862-6d609edb8612-serving-cert\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543314 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-client-ca\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543414 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-encryption-config\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543536 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djd4k\" (UniqueName: \"kubernetes.io/projected/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-kube-api-access-djd4k\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543634 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5t5k\" (UniqueName: \"kubernetes.io/projected/dff3a310-c986-4724-8862-6d609edb8612-kube-api-access-l5t5k\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543760 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-config\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543855 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-serving-cert\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.543935 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.551046 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tfjll"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.551529 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.551557 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-l6777"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.551648 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.554694 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.555155 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-md7dg"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.555291 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.555294 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.559043 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.559981 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mx28l"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.560476 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.562050 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5tbtc"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.562540 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6tfzr"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.562643 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-md7dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.562963 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.563432 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.564159 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mx28l"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.564473 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.565264 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.565458 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.568087 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.569117 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.569362 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.569442 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.570586 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l546h"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.571264 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l546h"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.572646 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.573235 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.577155 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-scclv"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.577627 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.577689 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.580092 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.580285 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.580855 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581091 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581439 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581624 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581786 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581943 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581966 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582005 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582184 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.581631 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582327 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582295 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582520 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 21:16:50 crc 
kubenswrapper[4914]: I0130 21:16:50.582629 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582635 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582647 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582805 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582872 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.582943 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.583071 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.583156 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.583182 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.591561 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.603071 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pscbd"] Jan 30 21:16:50 crc 
kubenswrapper[4914]: I0130 21:16:50.603119 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.605865 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.606191 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.614813 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.616323 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.630688 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.632093 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.654462 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b850795-7fca-417d-9e31-c319e45e2594-audit-dir\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.654590 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jvj\" (UniqueName: 
\"kubernetes.io/projected/247e526c-e643-4ffb-a6f2-b4678132b8a7-kube-api-access-r7jvj\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.654767 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b850795-7fca-417d-9e31-c319e45e2594-audit-dir\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.654797 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3efefb06-dccd-4432-8a91-9ac951803c21-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.655248 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zftk\" (UniqueName: \"kubernetes.io/projected/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-kube-api-access-4zftk\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.655531 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.655744 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 
21:16:50.655947 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656045 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656203 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656061 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-serving-cert\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656409 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-config\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656506 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-etcd-client\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656591 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: 
\"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656669 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656756 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dff3a310-c986-4724-8862-6d609edb8612-serving-cert\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656828 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-client-ca\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656908 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-encryption-config\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656976 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djd4k\" (UniqueName: 
\"kubernetes.io/projected/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-kube-api-access-djd4k\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657046 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5t5k\" (UniqueName: \"kubernetes.io/projected/dff3a310-c986-4724-8862-6d609edb8612-kube-api-access-l5t5k\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657125 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-config\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657197 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-serving-cert\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657272 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: 
I0130 21:16:50.657342 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657409 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657479 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-audit-policies\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657548 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-images\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657629 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-client-ca\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: 
\"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657699 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/247e526c-e643-4ffb-a6f2-b4678132b8a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657814 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657948 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ttp8\" (UniqueName: \"kubernetes.io/projected/8b850795-7fca-417d-9e31-c319e45e2594-kube-api-access-8ttp8\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658023 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-config\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658097 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz4sj\" 
(UniqueName: \"kubernetes.io/projected/3efefb06-dccd-4432-8a91-9ac951803c21-kube-api-access-jz4sj\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658370 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77g9c\" (UniqueName: \"kubernetes.io/projected/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-kube-api-access-77g9c\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658444 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659021 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.660293 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 
21:16:50.656412 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.660428 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.660645 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-config\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.660766 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-images\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.657663 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-config\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.661144 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f65q2"] Jan 30 21:16:50 crc kubenswrapper[4914]: 
I0130 21:16:50.661211 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-audit-policies\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.661401 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b850795-7fca-417d-9e31-c319e45e2594-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.662040 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-client-ca\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.662281 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-config\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.663566 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-etcd-client\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 
21:16:50.663775 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656449 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.664811 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.664854 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656496 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656510 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.656572 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658237 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658276 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658306 4914 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658345 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659059 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659131 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659184 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659299 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659429 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.658366 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659543 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659590 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.659911 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.666864 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.667011 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.665916 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dff3a310-c986-4724-8862-6d609edb8612-serving-cert\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.667404 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-serving-cert\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.669377 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-client-ca\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.669401 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.670410 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.669801 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.670606 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.669835 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.670858 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.671162 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.671292 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.671347 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.672285 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.672354 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.672892 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.673050 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.673308 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.673587 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.673784 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.673868 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.674085 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.674652 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/247e526c-e643-4ffb-a6f2-b4678132b8a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.675218 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-serving-cert\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.675623 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.675277 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.676143 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.676415 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b850795-7fca-417d-9e31-c319e45e2594-encryption-config\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.676839 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5lrgb"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.680935 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.685738 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.685750 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.687601 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.688080 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.688668 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.692869 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wx2ts"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.694044 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.697628 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.699246 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.701458 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.705507 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-snrcw"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.706634 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-snrcw"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.709768 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.710830 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.715237 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.715296 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.716671 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.717309 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.717370 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.717854 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.719557 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fhksq"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.720665 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fhksq"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.720917 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-85rbp"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.722304 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.725763 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.726866 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5tbtc"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.728806 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-scclv"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.730852 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.731315 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.732940 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-md7dg"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.734406 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.739781 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.740049 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mx28l"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.741267 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.742248 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.744805 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tfjll"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.746204 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.746861 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.748235 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.749838 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.751306 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2cd62"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.752666 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.755164 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.756286 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.757437 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wx2ts"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.758867 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8"]
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759107 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e2d746-634f-4c1a-9d70-9a61db901650-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759161 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-service-ca\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759200 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqchc\" (UniqueName: \"kubernetes.io/projected/95726c08-64b5-4c14-9eed-81815ea8efcb-kube-api-access-fqchc\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759225 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759268 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759293 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-service-ca\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759317 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkp6\" (UniqueName: \"kubernetes.io/projected/a86e9a60-2314-425d-acae-d6611ca8b181-kube-api-access-xlkp6\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759351 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564bf8fe-2efd-4e47-bbf5-f0dea6402178-config\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759378 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-oauth-serving-cert\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759405 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flj66\" (UniqueName: \"kubernetes.io/projected/2c4ec43d-4942-442e-8a64-78e724700938-kube-api-access-flj66\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759440 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-ca\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759464 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-config\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759497 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56bea571-93fc-4c52-aeef-39c979dfd095-serving-cert\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759522 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759554 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-serving-cert\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759582 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8b735a-2235-4ff8-920e-f40483600c05-config\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759609 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-service-ca-bundle\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759655 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/564bf8fe-2efd-4e47-bbf5-f0dea6402178-trusted-ca\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759687 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-policies\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759723 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95726c08-64b5-4c14-9eed-81815ea8efcb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759767 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759789 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2w7\" (UniqueName: \"kubernetes.io/projected/54963e78-1698-4be2-925c-be7dc08c34a6-kube-api-access-rs2w7\") pod \"migrator-59844c95c7-x7vxp\" (UID: \"54963e78-1698-4be2-925c-be7dc08c34a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759827 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408e8313-53b0-4848-9d70-c99eaa88d122-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759854 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2c4ec43d-4942-442e-8a64-78e724700938-auth-proxy-config\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759923 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8k2n\" (UniqueName: \"kubernetes.io/projected/564bf8fe-2efd-4e47-bbf5-f0dea6402178-kube-api-access-t8k2n\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759945 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb92g\" (UniqueName: \"kubernetes.io/projected/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-kube-api-access-qb92g\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.759975 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95726c08-64b5-4c14-9eed-81815ea8efcb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760004 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-image-import-ca\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760027 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvtdr\" (UniqueName: \"kubernetes.io/projected/56bea571-93fc-4c52-aeef-39c979dfd095-kube-api-access-qvtdr\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760056 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1e4319f-4808-4c3b-8dfb-4002f1bd7885-metrics-tls\") pod \"dns-operator-744455d44c-5tbtc\" (UID: \"d1e4319f-4808-4c3b-8dfb-4002f1bd7885\") " pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760089 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-encryption-config\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760110 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvbx\" (UniqueName: \"kubernetes.io/projected/77a21683-69d1-4459-aa95-cf4f0d33ec19-kube-api-access-gzvbx\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760139 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feb3e51e-2635-4659-bdb6-c3e72ed63b41-serving-cert\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760166 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760190 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-config\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760224 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a86e9a60-2314-425d-acae-d6611ca8b181-audit-dir\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760275 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2d746-634f-4c1a-9d70-9a61db901650-config\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760296 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-config\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760319 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760343 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4ec43d-4942-442e-8a64-78e724700938-config\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760365 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564bf8fe-2efd-4e47-bbf5-f0dea6402178-serving-cert\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760388 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknbr\" (UniqueName: \"kubernetes.io/projected/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-kube-api-access-mknbr\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760408 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760423 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760436 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760472 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760488 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a86e9a60-2314-425d-acae-d6611ca8b181-node-pullsecrets\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760501 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-serving-cert\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760517 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760535 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760551 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-oauth-config\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760565 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-trusted-ca-bundle\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760580 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4dd8\" (UniqueName: \"kubernetes.io/projected/8a73fa67-f017-4a93-a8f5-6d2f753dcb37-kube-api-access-q4dd8\") pod \"downloads-7954f5f757-mx28l\" (UID: \"8a73fa67-f017-4a93-a8f5-6d2f753dcb37\") " pod="openshift-console/downloads-7954f5f757-mx28l" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760595 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-etcd-serving-ca\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760612 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/95726c08-64b5-4c14-9eed-81815ea8efcb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760627 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-etcd-client\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" 
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760645 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2c4ec43d-4942-442e-8a64-78e724700938-machine-approver-tls\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760659 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-client\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760674 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p2ps\" (UniqueName: \"kubernetes.io/projected/408e8313-53b0-4848-9d70-c99eaa88d122-kube-api-access-8p2ps\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760689 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8b735a-2235-4ff8-920e-f40483600c05-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760724 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8b735a-2235-4ff8-920e-f40483600c05-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760746 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760883 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760921 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-audit\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.760964 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761000 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pzw4q\" (UniqueName: \"kubernetes.io/projected/feb3e51e-2635-4659-bdb6-c3e72ed63b41-kube-api-access-pzw4q\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761023 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-dir\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761060 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdhd2\" (UniqueName: \"kubernetes.io/projected/d1e4319f-4808-4c3b-8dfb-4002f1bd7885-kube-api-access-wdhd2\") pod \"dns-operator-744455d44c-5tbtc\" (UID: \"d1e4319f-4808-4c3b-8dfb-4002f1bd7885\") " pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761098 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761154 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-config\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761182 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/408e8313-53b0-4848-9d70-c99eaa88d122-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761198 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7e2d746-634f-4c1a-9d70-9a61db901650-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.761537 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kwcbv"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.762034 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.762569 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wgrx5"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.762981 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.763722 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.764596 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-snrcw"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.765597 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6tfzr"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.766196 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.766751 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.767771 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5lrgb"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.769569 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.771317 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wgrx5"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.772392 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l546h"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.773799 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd"] Jan 30 21:16:50 crc kubenswrapper[4914]: 
I0130 21:16:50.777463 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fhksq"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.779958 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.780814 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fj2g8"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.782487 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.784459 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fj2g8"] Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.786674 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.806652 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.826909 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.846730 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.862180 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4dd8\" (UniqueName: \"kubernetes.io/projected/8a73fa67-f017-4a93-a8f5-6d2f753dcb37-kube-api-access-q4dd8\") pod \"downloads-7954f5f757-mx28l\" (UID: \"8a73fa67-f017-4a93-a8f5-6d2f753dcb37\") " pod="openshift-console/downloads-7954f5f757-mx28l" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 
21:16:50.862236 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/95726c08-64b5-4c14-9eed-81815ea8efcb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.862259 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-etcd-client\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.862292 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-etcd-serving-ca\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.862974 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2c4ec43d-4942-442e-8a64-78e724700938-machine-approver-tls\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863009 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-client\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863029 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p2ps\" (UniqueName: \"kubernetes.io/projected/408e8313-53b0-4848-9d70-c99eaa88d122-kube-api-access-8p2ps\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863067 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8b735a-2235-4ff8-920e-f40483600c05-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863083 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8b735a-2235-4ff8-920e-f40483600c05-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863097 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863113 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-audit\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863133 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863149 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzw4q\" (UniqueName: \"kubernetes.io/projected/feb3e51e-2635-4659-bdb6-c3e72ed63b41-kube-api-access-pzw4q\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863201 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-dir\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863220 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdhd2\" (UniqueName: \"kubernetes.io/projected/d1e4319f-4808-4c3b-8dfb-4002f1bd7885-kube-api-access-wdhd2\") pod \"dns-operator-744455d44c-5tbtc\" (UID: \"d1e4319f-4808-4c3b-8dfb-4002f1bd7885\") " pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" Jan 30 21:16:50 
crc kubenswrapper[4914]: I0130 21:16:50.863236 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863253 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-config\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863269 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/408e8313-53b0-4848-9d70-c99eaa88d122-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863284 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7e2d746-634f-4c1a-9d70-9a61db901650-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863300 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-service-ca\") pod 
\"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863317 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqchc\" (UniqueName: \"kubernetes.io/projected/95726c08-64b5-4c14-9eed-81815ea8efcb-kube-api-access-fqchc\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863335 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863342 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-etcd-serving-ca\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863350 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e2d746-634f-4c1a-9d70-9a61db901650-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863428 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-service-ca\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863457 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlkp6\" (UniqueName: \"kubernetes.io/projected/a86e9a60-2314-425d-acae-d6611ca8b181-kube-api-access-xlkp6\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863484 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863522 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564bf8fe-2efd-4e47-bbf5-f0dea6402178-config\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863559 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-oauth-serving-cert\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863588 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-flj66\" (UniqueName: \"kubernetes.io/projected/2c4ec43d-4942-442e-8a64-78e724700938-kube-api-access-flj66\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863613 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-ca\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863648 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-config\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863671 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56bea571-93fc-4c52-aeef-39c979dfd095-serving-cert\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863696 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 
21:16:50.863741 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-serving-cert\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863765 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8b735a-2235-4ff8-920e-f40483600c05-config\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863805 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/564bf8fe-2efd-4e47-bbf5-f0dea6402178-trusted-ca\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863828 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-policies\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863850 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95726c08-64b5-4c14-9eed-81815ea8efcb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.863906 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-service-ca-bundle\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864002 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864032 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2w7\" (UniqueName: \"kubernetes.io/projected/54963e78-1698-4be2-925c-be7dc08c34a6-kube-api-access-rs2w7\") pod \"migrator-59844c95c7-x7vxp\" (UID: \"54963e78-1698-4be2-925c-be7dc08c34a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864057 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408e8313-53b0-4848-9d70-c99eaa88d122-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864081 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/2c4ec43d-4942-442e-8a64-78e724700938-auth-proxy-config\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864118 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8k2n\" (UniqueName: \"kubernetes.io/projected/564bf8fe-2efd-4e47-bbf5-f0dea6402178-kube-api-access-t8k2n\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864144 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb92g\" (UniqueName: \"kubernetes.io/projected/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-kube-api-access-qb92g\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864166 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95726c08-64b5-4c14-9eed-81815ea8efcb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864188 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-image-import-ca\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 
21:16:50.864211 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvtdr\" (UniqueName: \"kubernetes.io/projected/56bea571-93fc-4c52-aeef-39c979dfd095-kube-api-access-qvtdr\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864244 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1e4319f-4808-4c3b-8dfb-4002f1bd7885-metrics-tls\") pod \"dns-operator-744455d44c-5tbtc\" (UID: \"d1e4319f-4808-4c3b-8dfb-4002f1bd7885\") " pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864268 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-encryption-config\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864292 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feb3e51e-2635-4659-bdb6-c3e72ed63b41-serving-cert\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864315 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864342 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-config\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864364 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvbx\" (UniqueName: \"kubernetes.io/projected/77a21683-69d1-4459-aa95-cf4f0d33ec19-kube-api-access-gzvbx\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864392 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a86e9a60-2314-425d-acae-d6611ca8b181-audit-dir\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864428 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2d746-634f-4c1a-9d70-9a61db901650-config\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864465 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-config\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864491 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864514 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4ec43d-4942-442e-8a64-78e724700938-config\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864543 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknbr\" (UniqueName: \"kubernetes.io/projected/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-kube-api-access-mknbr\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864569 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864595 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864619 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864642 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564bf8fe-2efd-4e47-bbf5-f0dea6402178-serving-cert\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864679 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864721 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a86e9a60-2314-425d-acae-d6611ca8b181-node-pullsecrets\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 
21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864758 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864783 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864810 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-oauth-config\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864833 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-trusted-ca-bundle\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.864856 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-serving-cert\") pod 
\"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.865627 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2c4ec43d-4942-442e-8a64-78e724700938-machine-approver-tls\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.865920 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-config\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.865955 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-client\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.866638 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408e8313-53b0-4848-9d70-c99eaa88d122-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.865979 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4ec43d-4942-442e-8a64-78e724700938-config\") pod 
\"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.867133 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a86e9a60-2314-425d-acae-d6611ca8b181-node-pullsecrets\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.865962 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-image-import-ca\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.867307 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/564bf8fe-2efd-4e47-bbf5-f0dea6402178-trusted-ca\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.867382 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.867788 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-dir\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.867857 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.868011 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-serving-cert\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.868265 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-audit\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.868455 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.868747 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-service-ca-bundle\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" 
Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.868826 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a86e9a60-2314-425d-acae-d6611ca8b181-audit-dir\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.868908 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.869318 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.869456 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-etcd-client\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.869666 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56bea571-93fc-4c52-aeef-39c979dfd095-config\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.869767 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-config\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.869864 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.870144 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/564bf8fe-2efd-4e47-bbf5-f0dea6402178-config\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.870167 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a86e9a60-2314-425d-acae-d6611ca8b181-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.870253 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-policies\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: 
\"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.870299 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95726c08-64b5-4c14-9eed-81815ea8efcb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.870419 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2c4ec43d-4942-442e-8a64-78e724700938-auth-proxy-config\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.870877 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.871141 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/408e8313-53b0-4848-9d70-c99eaa88d122-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.871412 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.871424 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.872037 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.872267 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feb3e51e-2635-4659-bdb6-c3e72ed63b41-serving-cert\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.872538 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56bea571-93fc-4c52-aeef-39c979dfd095-serving-cert\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:50 crc 
kubenswrapper[4914]: I0130 21:16:50.873106 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.873483 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.873643 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.874250 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.874531 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1e4319f-4808-4c3b-8dfb-4002f1bd7885-metrics-tls\") pod 
\"dns-operator-744455d44c-5tbtc\" (UID: \"d1e4319f-4808-4c3b-8dfb-4002f1bd7885\") " pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.874621 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564bf8fe-2efd-4e47-bbf5-f0dea6402178-serving-cert\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.875442 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.875595 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a86e9a60-2314-425d-acae-d6611ca8b181-encryption-config\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.887052 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.913920 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.919471 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-service-ca\") 
pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.927008 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.929038 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2d746-634f-4c1a-9d70-9a61db901650-config\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.947016 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.966779 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.976168 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e2d746-634f-4c1a-9d70-9a61db901650-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:50 crc kubenswrapper[4914]: I0130 21:16:50.986315 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.006556 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 21:16:51 crc 
kubenswrapper[4914]: I0130 21:16:51.010151 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-oauth-serving-cert\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.027353 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.033023 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-oauth-config\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.048395 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.056312 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/feb3e51e-2635-4659-bdb6-c3e72ed63b41-etcd-ca\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.066555 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.086539 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.094239 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-serving-cert\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.107631 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.110222 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-config\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.126615 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.137180 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-service-ca\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.153505 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.160952 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-trusted-ca-bundle\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.167869 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.189049 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.208020 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.227379 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.237800 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8b735a-2235-4ff8-920e-f40483600c05-config\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.247370 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.266570 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.287643 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.299384 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/95726c08-64b5-4c14-9eed-81815ea8efcb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.307617 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.328455 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.341157 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8b735a-2235-4ff8-920e-f40483600c05-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.369360 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jvj\" (UniqueName: \"kubernetes.io/projected/247e526c-e643-4ffb-a6f2-b4678132b8a7-kube-api-access-r7jvj\") pod \"route-controller-manager-6576b87f9c-htz2z\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.384551 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zftk\" (UniqueName: \"kubernetes.io/projected/dd61313c-fbd2-486b-96c7-1f27ac8a3ac5-kube-api-access-4zftk\") pod \"openshift-controller-manager-operator-756b6f6bc6-79tl2\" (UID: \"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.427947 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.461965 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ttp8\" (UniqueName: \"kubernetes.io/projected/8b850795-7fca-417d-9e31-c319e45e2594-kube-api-access-8ttp8\") pod \"apiserver-7bbb656c7d-h9fmd\" (UID: \"8b850795-7fca-417d-9e31-c319e45e2594\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.480741 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.503903 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.527036 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.528174 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g9c\" (UniqueName: \"kubernetes.io/projected/bdabe348-b2e8-4c4c-a3d8-c5827a94e615-kube-api-access-77g9c\") pod \"openshift-config-operator-7777fb866f-tspt2\" (UID: \"bdabe348-b2e8-4c4c-a3d8-c5827a94e615\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.536533 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5t5k\" (UniqueName: \"kubernetes.io/projected/dff3a310-c986-4724-8862-6d609edb8612-kube-api-access-l5t5k\") pod \"controller-manager-879f6c89f-pscbd\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.548203 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.567155 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.588751 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.608029 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.629285 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.648854 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: E0130 21:16:51.655575 4914 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Jan 30 21:16:51 crc kubenswrapper[4914]: E0130 21:16:51.655658 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3efefb06-dccd-4432-8a91-9ac951803c21-samples-operator-tls podName:3efefb06-dccd-4432-8a91-9ac951803c21 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:52.155635171 +0000 UTC m=+145.594271942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3efefb06-dccd-4432-8a91-9ac951803c21-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-648dg" (UID: "3efefb06-dccd-4432-8a91-9ac951803c21") : failed to sync secret cache: timed out waiting for the condition Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.674039 4914 request.go:700] Waited for 1.002899613s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.678967 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.687520 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.699497 4914 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.708630 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.713800 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"] Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.727073 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.746595 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.765570 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2"] Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.766249 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.786512 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.795808 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd"] Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.806557 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.817617 4914 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.826090 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.846676 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.851320 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pscbd"] Jan 30 21:16:51 crc kubenswrapper[4914]: W0130 21:16:51.864654 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff3a310_c986_4724_8862_6d609edb8612.slice/crio-6290973f00dcb777e9b3f56eae8a34f71120f7569285ff79653465dcb11c21f2 WatchSource:0}: Error finding container 6290973f00dcb777e9b3f56eae8a34f71120f7569285ff79653465dcb11c21f2: Status 404 returned error can't find the container with id 6290973f00dcb777e9b3f56eae8a34f71120f7569285ff79653465dcb11c21f2 Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.866538 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.886154 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.910328 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.926726 4914 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.950128 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.966788 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.986699 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 21:16:51 crc kubenswrapper[4914]: I0130 21:16:51.994055 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tspt2"] Jan 30 21:16:52 crc kubenswrapper[4914]: W0130 21:16:52.004537 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdabe348_b2e8_4c4c_a3d8_c5827a94e615.slice/crio-f85c7841d53be3fb634244dbd9ec797ae9d6e14011448889c0858a977035ab04 WatchSource:0}: Error finding container f85c7841d53be3fb634244dbd9ec797ae9d6e14011448889c0858a977035ab04: Status 404 returned error can't find the container with id f85c7841d53be3fb634244dbd9ec797ae9d6e14011448889c0858a977035ab04 Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.007056 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.030119 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.046629 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 
21:16:52.066792 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.093588 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.107643 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.127429 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.147115 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.167457 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.185830 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3efefb06-dccd-4432-8a91-9ac951803c21-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.187194 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.209852 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.226692 4914 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.247993 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.267435 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.287244 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.307128 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.327124 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.347528 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.368437 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.387851 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.408119 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.427122 4914 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 21:16:52 crc 
kubenswrapper[4914]: E0130 21:16:52.439501 4914 projected.go:288] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.446937 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.467193 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: E0130 21:16:52.478867 4914 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.488363 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.507010 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.527452 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.547457 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.567021 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.587155 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.606413 4914 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.618196 4914 generic.go:334] "Generic (PLEG): container finished" podID="bdabe348-b2e8-4c4c-a3d8-c5827a94e615" containerID="9cedbe9b2acdebc6bc96b21cf2fc4506e84669ce7189b45da1e7670e9c958593" exitCode=0 Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.618253 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" event={"ID":"bdabe348-b2e8-4c4c-a3d8-c5827a94e615","Type":"ContainerDied","Data":"9cedbe9b2acdebc6bc96b21cf2fc4506e84669ce7189b45da1e7670e9c958593"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.618312 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" event={"ID":"bdabe348-b2e8-4c4c-a3d8-c5827a94e615","Type":"ContainerStarted","Data":"f85c7841d53be3fb634244dbd9ec797ae9d6e14011448889c0858a977035ab04"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.619663 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" event={"ID":"dff3a310-c986-4724-8862-6d609edb8612","Type":"ContainerStarted","Data":"917f63e4bb780d9637d1222a52f82e6541b5d41edd0b80b535d176b614e455b1"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.619680 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" event={"ID":"dff3a310-c986-4724-8862-6d609edb8612","Type":"ContainerStarted","Data":"6290973f00dcb777e9b3f56eae8a34f71120f7569285ff79653465dcb11c21f2"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.620060 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.621610 
4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" event={"ID":"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5","Type":"ContainerStarted","Data":"6302a2b62dc5f7954c73af145454efe68800950a25a66df676c594cab7922b6e"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.621630 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" event={"ID":"dd61313c-fbd2-486b-96c7-1f27ac8a3ac5","Type":"ContainerStarted","Data":"b82ee3728db88ff9cbb295dc00d2b50e26cf46dc5f5604784cd95d31e537ce69"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.623275 4914 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pscbd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.623343 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" podUID="dff3a310-c986-4724-8862-6d609edb8612" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.623562 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" event={"ID":"247e526c-e643-4ffb-a6f2-b4678132b8a7","Type":"ContainerStarted","Data":"6c7673879cc11c7b85f34981f7ca4377f62dc33e66b668449724f01aceffef7f"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.623593 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" 
event={"ID":"247e526c-e643-4ffb-a6f2-b4678132b8a7","Type":"ContainerStarted","Data":"2bd09aace6ad40f8b2797f0d33b32e98acb09b4c8205d69edda8ecb6f0ab792c"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.623790 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.626007 4914 generic.go:334] "Generic (PLEG): container finished" podID="8b850795-7fca-417d-9e31-c319e45e2594" containerID="80cf8a1033cd8385f48e45786d50f62b51e040370d5c98a7b3259ea6cf2d91c5" exitCode=0 Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.626058 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" event={"ID":"8b850795-7fca-417d-9e31-c319e45e2594","Type":"ContainerDied","Data":"80cf8a1033cd8385f48e45786d50f62b51e040370d5c98a7b3259ea6cf2d91c5"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.626087 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" event={"ID":"8b850795-7fca-417d-9e31-c319e45e2594","Type":"ContainerStarted","Data":"9702542aa13d82c61b19a52ced9fd0cfa99814825b9980a0d9b13aa44fd0ece4"} Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.627195 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.647551 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.667557 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.685543 4914 request.go:700] Waited for 1.82313498s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.710523 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4dd8\" (UniqueName: \"kubernetes.io/projected/8a73fa67-f017-4a93-a8f5-6d2f753dcb37-kube-api-access-q4dd8\") pod \"downloads-7954f5f757-mx28l\" (UID: \"8a73fa67-f017-4a93-a8f5-6d2f753dcb37\") " pod="openshift-console/downloads-7954f5f757-mx28l" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.725280 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2w7\" (UniqueName: \"kubernetes.io/projected/54963e78-1698-4be2-925c-be7dc08c34a6-kube-api-access-rs2w7\") pod \"migrator-59844c95c7-x7vxp\" (UID: \"54963e78-1698-4be2-925c-be7dc08c34a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.745773 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flj66\" (UniqueName: \"kubernetes.io/projected/2c4ec43d-4942-442e-8a64-78e724700938-kube-api-access-flj66\") pod \"machine-approver-56656f9798-l6777\" (UID: \"2c4ec43d-4942-442e-8a64-78e724700938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.763374 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlkp6\" (UniqueName: \"kubernetes.io/projected/a86e9a60-2314-425d-acae-d6611ca8b181-kube-api-access-xlkp6\") pod \"apiserver-76f77b778f-tfjll\" (UID: \"a86e9a60-2314-425d-acae-d6611ca8b181\") " pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.783807 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvtdr\" (UniqueName: 
\"kubernetes.io/projected/56bea571-93fc-4c52-aeef-39c979dfd095-kube-api-access-qvtdr\") pod \"authentication-operator-69f744f599-ws7lj\" (UID: \"56bea571-93fc-4c52-aeef-39c979dfd095\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.802389 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p2ps\" (UniqueName: \"kubernetes.io/projected/408e8313-53b0-4848-9d70-c99eaa88d122-kube-api-access-8p2ps\") pod \"openshift-apiserver-operator-796bbdcf4f-nblc8\" (UID: \"408e8313-53b0-4848-9d70-c99eaa88d122\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.809054 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.815488 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.816531 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.822289 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8b735a-2235-4ff8-920e-f40483600c05-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4p4nt\" (UID: \"ca8b735a-2235-4ff8-920e-f40483600c05\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.827409 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.847050 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknbr\" (UniqueName: \"kubernetes.io/projected/427fbe21-4cfb-4e3f-868f-6b40ab37f9f6-kube-api-access-mknbr\") pod \"kube-storage-version-migrator-operator-b67b599dd-59dcc\" (UID: \"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.856920 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mx28l" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.862223 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvbx\" (UniqueName: \"kubernetes.io/projected/77a21683-69d1-4459-aa95-cf4f0d33ec19-kube-api-access-gzvbx\") pod \"console-f9d7485db-scclv\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") " pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.864662 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.879222 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7e2d746-634f-4c1a-9d70-9a61db901650-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j876b\" (UID: \"c7e2d746-634f-4c1a-9d70-9a61db901650\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.880014 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.896176 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.902070 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.903729 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95726c08-64b5-4c14-9eed-81815ea8efcb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.909603 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.926773 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqchc\" (UniqueName: \"kubernetes.io/projected/95726c08-64b5-4c14-9eed-81815ea8efcb-kube-api-access-fqchc\") pod \"cluster-image-registry-operator-dc59b4c8b-whh9t\" (UID: \"95726c08-64b5-4c14-9eed-81815ea8efcb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.950931 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdhd2\" (UniqueName: \"kubernetes.io/projected/d1e4319f-4808-4c3b-8dfb-4002f1bd7885-kube-api-access-wdhd2\") pod \"dns-operator-744455d44c-5tbtc\" (UID: \"d1e4319f-4808-4c3b-8dfb-4002f1bd7885\") " pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.952520 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:16:52 crc kubenswrapper[4914]: I0130 21:16:52.964554 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzw4q\" (UniqueName: \"kubernetes.io/projected/feb3e51e-2635-4659-bdb6-c3e72ed63b41-kube-api-access-pzw4q\") pod \"etcd-operator-b45778765-l546h\" (UID: \"feb3e51e-2635-4659-bdb6-c3e72ed63b41\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:52.987041 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb92g\" (UniqueName: \"kubernetes.io/projected/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-kube-api-access-qb92g\") pod \"oauth-openshift-558db77b4-2cd62\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.009145 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.015330 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8k2n\" (UniqueName: \"kubernetes.io/projected/564bf8fe-2efd-4e47-bbf5-f0dea6402178-kube-api-access-t8k2n\") pod \"console-operator-58897d9998-md7dg\" (UID: \"564bf8fe-2efd-4e47-bbf5-f0dea6402178\") " pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.039639 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.062103 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.067382 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.071863 4914 projected.go:194] Error preparing data for projected volume kube-api-access-djd4k for pod openshift-machine-api/machine-api-operator-5694c8668f-85rbp: failed to sync configmap cache: timed out waiting for the condition Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.071957 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-kube-api-access-djd4k podName:6b3718ea-66f6-4f01-97c5-94c7c844e1a0 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.571934267 +0000 UTC m=+147.010571028 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-djd4k" (UniqueName: "kubernetes.io/projected/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-kube-api-access-djd4k") pod "machine-api-operator-5694c8668f-85rbp" (UID: "6b3718ea-66f6-4f01-97c5-94c7c844e1a0") : failed to sync configmap cache: timed out waiting for the condition Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.091296 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102042 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ws7lj"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102282 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102336 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102369 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkwh4\" (UniqueName: \"kubernetes.io/projected/401dfb2e-119a-487d-915a-b2bfdb275f74-kube-api-access-dkwh4\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102395 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-certificates\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102428 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-tls\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102474 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-trusted-ca\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102499 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401dfb2e-119a-487d-915a-b2bfdb275f74-trusted-ca\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102566 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/401dfb2e-119a-487d-915a-b2bfdb275f74-metrics-tls\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102592 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr5b6\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-kube-api-access-jr5b6\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102623 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102649 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-bound-sa-token\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.102674 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/401dfb2e-119a-487d-915a-b2bfdb275f74-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 
21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.103079 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.60306561 +0000 UTC m=+147.041702371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.106489 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.120226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3efefb06-dccd-4432-8a91-9ac951803c21-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.126725 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.135552 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.148441 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.149866 4914 projected.go:194] Error preparing data for projected volume kube-api-access-jz4sj for pod openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg: failed to sync configmap cache: timed out waiting for the condition Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.150129 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3efefb06-dccd-4432-8a91-9ac951803c21-kube-api-access-jz4sj podName:3efefb06-dccd-4432-8a91-9ac951803c21 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.649934913 +0000 UTC m=+147.088571674 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jz4sj" (UniqueName: "kubernetes.io/projected/3efefb06-dccd-4432-8a91-9ac951803c21-kube-api-access-jz4sj") pod "cluster-samples-operator-665b6dd947-648dg" (UID: "3efefb06-dccd-4432-8a91-9ac951803c21") : failed to sync configmap cache: timed out waiting for the condition Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.150192 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" Jan 30 21:16:53 crc kubenswrapper[4914]: W0130 21:16:53.157492 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56bea571_93fc_4c52_aeef_39c979dfd095.slice/crio-6527c19ed2a2df2c1870b0419d48cff7d1789097f33c3fd42638abd08cf78cda WatchSource:0}: Error finding container 6527c19ed2a2df2c1870b0419d48cff7d1789097f33c3fd42638abd08cf78cda: Status 404 returned error can't find the container with id 6527c19ed2a2df2c1870b0419d48cff7d1789097f33c3fd42638abd08cf78cda Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.190293 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203636 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203798 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-webhook-cert\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203893 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/421cd57e-a15d-457a-a515-d071c7720f85-proxy-tls\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: 
\"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203913 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-default-certificate\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203947 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-stats-auth\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203963 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203980 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5hw\" (UniqueName: \"kubernetes.io/projected/3846973a-e8a4-432a-b800-99b21bc0a93a-kube-api-access-ts5hw\") pod \"package-server-manager-789f6589d5-ptx46\" (UID: \"3846973a-e8a4-432a-b800-99b21bc0a93a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.203996 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vhxkl\" (UniqueName: \"kubernetes.io/projected/e050cbd0-653b-4d23-8a69-affa52be9608-kube-api-access-vhxkl\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204027 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8602f25-310b-4d02-af41-fe47753bfcfe-srv-cert\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204044 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkwh4\" (UniqueName: \"kubernetes.io/projected/401dfb2e-119a-487d-915a-b2bfdb275f74-kube-api-access-dkwh4\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204064 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-registration-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204078 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-service-ca-bundle\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " 
pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204100 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-metrics-tls\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204116 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-tls\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.204139 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-socket-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.204616 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.704590934 +0000 UTC m=+147.143227715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205201 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-certs\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205328 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-apiservice-cert\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205362 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2767\" (UniqueName: \"kubernetes.io/projected/dfc55944-5aa9-4e66-b049-d109415b0f5e-kube-api-access-x2767\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205413 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qc7z\" (UniqueName: 
\"kubernetes.io/projected/f492bda5-ee80-44ea-9c78-42d5bfd959da-kube-api-access-9qc7z\") pod \"multus-admission-controller-857f4d67dd-5lrgb\" (UID: \"f492bda5-ee80-44ea-9c78-42d5bfd959da\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205450 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-config-volume\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205473 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-node-bootstrap-token\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205539 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/421cd57e-a15d-457a-a515-d071c7720f85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205600 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-csi-data-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc 
kubenswrapper[4914]: I0130 21:16:53.205636 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f492bda5-ee80-44ea-9c78-42d5bfd959da-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5lrgb\" (UID: \"f492bda5-ee80-44ea-9c78-42d5bfd959da\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205686 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr5b6\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-kube-api-access-jr5b6\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205728 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xkkz\" (UniqueName: \"kubernetes.io/projected/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-kube-api-access-2xkkz\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205751 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-metrics-certs\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205789 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-plugins-dir\") pod 
\"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205811 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-signing-key\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205843 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-tmpfs\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205863 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205898 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.205924 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-bound-sa-token\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206005 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc55944-5aa9-4e66-b049-d109415b0f5e-proxy-tls\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206059 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/401dfb2e-119a-487d-915a-b2bfdb275f74-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206081 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/489f08b5-b5d7-45a2-98e7-0c26139ed1d9-cert\") pod \"ingress-canary-wgrx5\" (UID: \"489f08b5-b5d7-45a2-98e7-0c26139ed1d9\") " pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206105 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb07f77-973c-4d69-b6f4-d250599cf3a3-serving-cert\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206129 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97l7\" (UniqueName: \"kubernetes.io/projected/c5564eec-0f5f-407c-b34f-3d22c5f9921b-kube-api-access-d97l7\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206169 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3846973a-e8a4-432a-b800-99b21bc0a93a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ptx46\" (UID: \"3846973a-e8a4-432a-b800-99b21bc0a93a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206190 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e050cbd0-653b-4d23-8a69-affa52be9608-secret-volume\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206239 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h9wf\" (UniqueName: \"kubernetes.io/projected/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-kube-api-access-8h9wf\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206264 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206380 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcs9\" (UniqueName: \"kubernetes.io/projected/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-kube-api-access-htcs9\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206415 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spb22\" (UniqueName: \"kubernetes.io/projected/489f08b5-b5d7-45a2-98e7-0c26139ed1d9-kube-api-access-spb22\") pod \"ingress-canary-wgrx5\" (UID: \"489f08b5-b5d7-45a2-98e7-0c26139ed1d9\") " pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206465 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206487 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-signing-cabundle\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206520 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-mountpoint-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206543 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3882a6f-456e-4016-b6b1-76a916735c3b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-p49xt\" (UID: \"e3882a6f-456e-4016-b6b1-76a916735c3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206593 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2j2x\" (UniqueName: \"kubernetes.io/projected/5cb07f77-973c-4d69-b6f4-d250599cf3a3-kube-api-access-b2j2x\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206644 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4frr\" (UniqueName: \"kubernetes.io/projected/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-kube-api-access-r4frr\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206735 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3f94df58-70b2-4856-8878-b0fc196d6f6d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206759 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfc55944-5aa9-4e66-b049-d109415b0f5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206782 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9z9k\" (UniqueName: \"kubernetes.io/projected/e3882a6f-456e-4016-b6b1-76a916735c3b-kube-api-access-x9z9k\") pod \"control-plane-machine-set-operator-78cbb6b69f-p49xt\" (UID: \"e3882a6f-456e-4016-b6b1-76a916735c3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206819 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206851 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-certificates\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: 
\"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206921 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e050cbd0-653b-4d23-8a69-affa52be9608-config-volume\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206944 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8602f25-310b-4d02-af41-fe47753bfcfe-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.206994 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp2b9\" (UniqueName: \"kubernetes.io/projected/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-kube-api-access-xp2b9\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207035 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5564eec-0f5f-407c-b34f-3d22c5f9921b-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207081 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvnl\" (UniqueName: \"kubernetes.io/projected/06adcbe1-93b1-4edd-b1a5-536f0c54043e-kube-api-access-khvnl\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207121 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-trusted-ca\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207137 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qnq\" (UniqueName: \"kubernetes.io/projected/421cd57e-a15d-457a-a515-d071c7720f85-kube-api-access-z7qnq\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207166 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401dfb2e-119a-487d-915a-b2bfdb275f74-trusted-ca\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207209 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpsbt\" (UniqueName: \"kubernetes.io/projected/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-kube-api-access-wpsbt\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " 
pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207263 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dfc55944-5aa9-4e66-b049-d109415b0f5e-images\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207290 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb07f77-973c-4d69-b6f4-d250599cf3a3-config\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207354 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z5c8\" (UniqueName: \"kubernetes.io/projected/e8602f25-310b-4d02-af41-fe47753bfcfe-kube-api-access-2z5c8\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207395 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f94df58-70b2-4856-8878-b0fc196d6f6d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207464 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f94df58-70b2-4856-8878-b0fc196d6f6d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207487 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5564eec-0f5f-407c-b34f-3d22c5f9921b-srv-cert\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.207611 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/401dfb2e-119a-487d-915a-b2bfdb275f74-metrics-tls\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.249022 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/401dfb2e-119a-487d-915a-b2bfdb275f74-trusted-ca\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.254874 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-certificates\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc 
kubenswrapper[4914]: I0130 21:16:53.258494 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-trusted-ca\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.259830 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.264297 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.764265926 +0000 UTC m=+147.202902687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.264532 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.264849 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkwh4\" (UniqueName: \"kubernetes.io/projected/401dfb2e-119a-487d-915a-b2bfdb275f74-kube-api-access-dkwh4\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.267398 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-tls\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.291156 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/401dfb2e-119a-487d-915a-b2bfdb275f74-metrics-tls\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.309741 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.309968 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5hw\" (UniqueName: \"kubernetes.io/projected/3846973a-e8a4-432a-b800-99b21bc0a93a-kube-api-access-ts5hw\") pod 
\"package-server-manager-789f6589d5-ptx46\" (UID: \"3846973a-e8a4-432a-b800-99b21bc0a93a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310006 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxkl\" (UniqueName: \"kubernetes.io/projected/e050cbd0-653b-4d23-8a69-affa52be9608-kube-api-access-vhxkl\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310035 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8602f25-310b-4d02-af41-fe47753bfcfe-srv-cert\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310058 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-registration-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310081 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-service-ca-bundle\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310104 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-metrics-tls\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310127 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-socket-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310149 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-certs\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310192 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-apiservice-cert\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310214 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2767\" (UniqueName: \"kubernetes.io/projected/dfc55944-5aa9-4e66-b049-d109415b0f5e-kube-api-access-x2767\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310244 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9qc7z\" (UniqueName: \"kubernetes.io/projected/f492bda5-ee80-44ea-9c78-42d5bfd959da-kube-api-access-9qc7z\") pod \"multus-admission-controller-857f4d67dd-5lrgb\" (UID: \"f492bda5-ee80-44ea-9c78-42d5bfd959da\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310267 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-node-bootstrap-token\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310287 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-config-volume\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310321 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/421cd57e-a15d-457a-a515-d071c7720f85-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310347 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-csi-data-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310368 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f492bda5-ee80-44ea-9c78-42d5bfd959da-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5lrgb\" (UID: \"f492bda5-ee80-44ea-9c78-42d5bfd959da\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310392 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-metrics-certs\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310425 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xkkz\" (UniqueName: \"kubernetes.io/projected/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-kube-api-access-2xkkz\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310447 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-plugins-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310467 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-signing-key\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: 
I0130 21:16:53.310487 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-tmpfs\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310533 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc55944-5aa9-4e66-b049-d109415b0f5e-proxy-tls\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310556 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97l7\" (UniqueName: \"kubernetes.io/projected/c5564eec-0f5f-407c-b34f-3d22c5f9921b-kube-api-access-d97l7\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310587 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/489f08b5-b5d7-45a2-98e7-0c26139ed1d9-cert\") pod \"ingress-canary-wgrx5\" (UID: \"489f08b5-b5d7-45a2-98e7-0c26139ed1d9\") " pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310609 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb07f77-973c-4d69-b6f4-d250599cf3a3-serving-cert\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310633 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3846973a-e8a4-432a-b800-99b21bc0a93a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ptx46\" (UID: \"3846973a-e8a4-432a-b800-99b21bc0a93a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310656 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e050cbd0-653b-4d23-8a69-affa52be9608-secret-volume\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310683 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h9wf\" (UniqueName: \"kubernetes.io/projected/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-kube-api-access-8h9wf\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310727 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310752 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htcs9\" 
(UniqueName: \"kubernetes.io/projected/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-kube-api-access-htcs9\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310776 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spb22\" (UniqueName: \"kubernetes.io/projected/489f08b5-b5d7-45a2-98e7-0c26139ed1d9-kube-api-access-spb22\") pod \"ingress-canary-wgrx5\" (UID: \"489f08b5-b5d7-45a2-98e7-0c26139ed1d9\") " pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310815 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-mountpoint-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310843 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3882a6f-456e-4016-b6b1-76a916735c3b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-p49xt\" (UID: \"e3882a6f-456e-4016-b6b1-76a916735c3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310866 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-signing-cabundle\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310890 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b2j2x\" (UniqueName: \"kubernetes.io/projected/5cb07f77-973c-4d69-b6f4-d250599cf3a3-kube-api-access-b2j2x\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310914 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4frr\" (UniqueName: \"kubernetes.io/projected/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-kube-api-access-r4frr\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310938 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f94df58-70b2-4856-8878-b0fc196d6f6d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310960 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfc55944-5aa9-4e66-b049-d109415b0f5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.310981 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9z9k\" (UniqueName: \"kubernetes.io/projected/e3882a6f-456e-4016-b6b1-76a916735c3b-kube-api-access-x9z9k\") pod \"control-plane-machine-set-operator-78cbb6b69f-p49xt\" 
(UID: \"e3882a6f-456e-4016-b6b1-76a916735c3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311002 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311028 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e050cbd0-653b-4d23-8a69-affa52be9608-config-volume\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311051 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8602f25-310b-4d02-af41-fe47753bfcfe-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311083 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp2b9\" (UniqueName: \"kubernetes.io/projected/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-kube-api-access-xp2b9\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311105 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5564eec-0f5f-407c-b34f-3d22c5f9921b-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311128 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khvnl\" (UniqueName: \"kubernetes.io/projected/06adcbe1-93b1-4edd-b1a5-536f0c54043e-kube-api-access-khvnl\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311155 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qnq\" (UniqueName: \"kubernetes.io/projected/421cd57e-a15d-457a-a515-d071c7720f85-kube-api-access-z7qnq\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311179 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpsbt\" (UniqueName: \"kubernetes.io/projected/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-kube-api-access-wpsbt\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311210 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dfc55944-5aa9-4e66-b049-d109415b0f5e-images\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311233 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb07f77-973c-4d69-b6f4-d250599cf3a3-config\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311254 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f94df58-70b2-4856-8878-b0fc196d6f6d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311274 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z5c8\" (UniqueName: \"kubernetes.io/projected/e8602f25-310b-4d02-af41-fe47753bfcfe-kube-api-access-2z5c8\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311308 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f94df58-70b2-4856-8878-b0fc196d6f6d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311329 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/c5564eec-0f5f-407c-b34f-3d22c5f9921b-srv-cert\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311358 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-webhook-cert\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311387 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/421cd57e-a15d-457a-a515-d071c7720f85-proxy-tls\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311411 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-default-certificate\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.311438 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-stats-auth\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.311937 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.811916898 +0000 UTC m=+147.250553659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.312865 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-csi-data-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.313110 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-registration-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.314447 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-service-ca-bundle\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.318686 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8602f25-310b-4d02-af41-fe47753bfcfe-srv-cert\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.321270 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-socket-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.323165 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-bound-sa-token\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.325944 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-mountpoint-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.326169 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/06adcbe1-93b1-4edd-b1a5-536f0c54043e-plugins-dir\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.326559 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/dfc55944-5aa9-4e66-b049-d109415b0f5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.327629 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e050cbd0-653b-4d23-8a69-affa52be9608-config-volume\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.328105 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.338198 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-signing-cabundle\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.340317 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-stats-auth\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.341719 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-metrics-tls\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.341759 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.345175 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5564eec-0f5f-407c-b34f-3d22c5f9921b-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.345316 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-node-bootstrap-token\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.346337 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr5b6\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-kube-api-access-jr5b6\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 
21:16:53.347485 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f492bda5-ee80-44ea-9c78-42d5bfd959da-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5lrgb\" (UID: \"f492bda5-ee80-44ea-9c78-42d5bfd959da\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.348217 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-certs\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.348552 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc55944-5aa9-4e66-b049-d109415b0f5e-proxy-tls\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.348764 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8602f25-310b-4d02-af41-fe47753bfcfe-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.349292 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dfc55944-5aa9-4e66-b049-d109415b0f5e-images\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.349488 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-metrics-certs\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.368337 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb07f77-973c-4d69-b6f4-d250599cf3a3-config\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.369479 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3882a6f-456e-4016-b6b1-76a916735c3b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-p49xt\" (UID: \"e3882a6f-456e-4016-b6b1-76a916735c3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.372482 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-signing-key\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.377746 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/421cd57e-a15d-457a-a515-d071c7720f85-mcc-auth-proxy-config\") 
pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.380399 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-apiservice-cert\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.380870 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-tmpfs\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.387342 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-config-volume\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.387366 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb07f77-973c-4d69-b6f4-d250599cf3a3-serving-cert\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.390251 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f94df58-70b2-4856-8878-b0fc196d6f6d-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.390933 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-webhook-cert\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.393281 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f94df58-70b2-4856-8878-b0fc196d6f6d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.393851 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/489f08b5-b5d7-45a2-98e7-0c26139ed1d9-cert\") pod \"ingress-canary-wgrx5\" (UID: \"489f08b5-b5d7-45a2-98e7-0c26139ed1d9\") " pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.394320 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e050cbd0-653b-4d23-8a69-affa52be9608-secret-volume\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.396018 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-default-certificate\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.397114 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5564eec-0f5f-407c-b34f-3d22c5f9921b-srv-cert\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.399351 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/401dfb2e-119a-487d-915a-b2bfdb275f74-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5stwz\" (UID: \"401dfb2e-119a-487d-915a-b2bfdb275f74\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.402014 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/421cd57e-a15d-457a-a515-d071c7720f85-proxy-tls\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.405264 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h9wf\" (UniqueName: \"kubernetes.io/projected/a02a92b6-edd8-4ae5-871b-cea79ac68d5a-kube-api-access-8h9wf\") pod \"service-ca-9c57cc56f-snrcw\" (UID: \"a02a92b6-edd8-4ae5-871b-cea79ac68d5a\") " pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.413337 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.413791 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:53.913780301 +0000 UTC m=+147.352417062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.422505 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5hw\" (UniqueName: \"kubernetes.io/projected/3846973a-e8a4-432a-b800-99b21bc0a93a-kube-api-access-ts5hw\") pod \"package-server-manager-789f6589d5-ptx46\" (UID: \"3846973a-e8a4-432a-b800-99b21bc0a93a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.423646 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tfjll"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.429304 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3846973a-e8a4-432a-b800-99b21bc0a93a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ptx46\" (UID: \"3846973a-e8a4-432a-b800-99b21bc0a93a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.455635 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2j2x\" (UniqueName: \"kubernetes.io/projected/5cb07f77-973c-4d69-b6f4-d250599cf3a3-kube-api-access-b2j2x\") pod \"service-ca-operator-777779d784-ww6zz\" (UID: \"5cb07f77-973c-4d69-b6f4-d250599cf3a3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.468531 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxkl\" (UniqueName: \"kubernetes.io/projected/e050cbd0-653b-4d23-8a69-affa52be9608-kube-api-access-vhxkl\") pod \"collect-profiles-29496795-pwhjd\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.473604 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.475529 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcs9\" (UniqueName: \"kubernetes.io/projected/ff6855e6-7cae-432d-a7bf-6d3879ca88c3-kube-api-access-htcs9\") pod \"dns-default-fj2g8\" (UID: \"ff6855e6-7cae-432d-a7bf-6d3879ca88c3\") " pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.514137 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spb22\" (UniqueName: \"kubernetes.io/projected/489f08b5-b5d7-45a2-98e7-0c26139ed1d9-kube-api-access-spb22\") pod \"ingress-canary-wgrx5\" (UID: \"489f08b5-b5d7-45a2-98e7-0c26139ed1d9\") " pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.514488 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.514898 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.014886505 +0000 UTC m=+147.453523266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.516794 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4frr\" (UniqueName: \"kubernetes.io/projected/a961d0f9-f1b6-4a3b-8c49-f03f1b797632-kube-api-access-r4frr\") pod \"router-default-5444994796-f65q2\" (UID: \"a961d0f9-f1b6-4a3b-8c49-f03f1b797632\") " pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.530493 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.543813 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f94df58-70b2-4856-8878-b0fc196d6f6d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cjmd9\" (UID: \"3f94df58-70b2-4856-8878-b0fc196d6f6d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.552657 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9z9k\" (UniqueName: \"kubernetes.io/projected/e3882a6f-456e-4016-b6b1-76a916735c3b-kube-api-access-x9z9k\") pod \"control-plane-machine-set-operator-78cbb6b69f-p49xt\" (UID: \"e3882a6f-456e-4016-b6b1-76a916735c3b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: 
I0130 21:16:53.573400 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-scclv"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.581941 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp2b9\" (UniqueName: \"kubernetes.io/projected/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-kube-api-access-xp2b9\") pod \"marketplace-operator-79b997595-wx2ts\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.591420 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.614275 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xkkz\" (UniqueName: \"kubernetes.io/projected/2a20a3b7-3c40-4816-a5b2-4d756cfbd948-kube-api-access-2xkkz\") pod \"packageserver-d55dfcdfc-w4dnw\" (UID: \"2a20a3b7-3c40-4816-a5b2-4d756cfbd948\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.614900 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.615591 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djd4k\" (UniqueName: \"kubernetes.io/projected/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-kube-api-access-djd4k\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.615683 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.616012 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.116001629 +0000 UTC m=+147.554638380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.617774 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2767\" (UniqueName: \"kubernetes.io/projected/dfc55944-5aa9-4e66-b049-d109415b0f5e-kube-api-access-x2767\") pod \"machine-config-operator-74547568cd-r9zkn\" (UID: \"dfc55944-5aa9-4e66-b049-d109415b0f5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: W0130 21:16:53.619004 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408e8313_53b0_4848_9d70_c99eaa88d122.slice/crio-e56ff74fad308ada2346cbae905408f287b35ee6b13bc33814576503ebc0d989 WatchSource:0}: Error finding container e56ff74fad308ada2346cbae905408f287b35ee6b13bc33814576503ebc0d989: Status 404 returned error can't find the container with id e56ff74fad308ada2346cbae905408f287b35ee6b13bc33814576503ebc0d989 Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.626396 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djd4k\" (UniqueName: \"kubernetes.io/projected/6b3718ea-66f6-4f01-97c5-94c7c844e1a0-kube-api-access-djd4k\") pod \"machine-api-operator-5694c8668f-85rbp\" (UID: \"6b3718ea-66f6-4f01-97c5-94c7c844e1a0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.631393 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc"] Jan 30 21:16:53 crc kubenswrapper[4914]: W0130 21:16:53.633663 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77a21683_69d1_4459_aa95_cf4f0d33ec19.slice/crio-1a15303484b5e02914a3a24ffb174692567f7691ff5cc6f589dc0c95044b11fa WatchSource:0}: Error finding container 1a15303484b5e02914a3a24ffb174692567f7691ff5cc6f589dc0c95044b11fa: Status 404 returned error can't find the container with id 1a15303484b5e02914a3a24ffb174692567f7691ff5cc6f589dc0c95044b11fa Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.634346 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.640677 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.641990 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khvnl\" (UniqueName: \"kubernetes.io/projected/06adcbe1-93b1-4edd-b1a5-536f0c54043e-kube-api-access-khvnl\") pod \"csi-hostpathplugin-fhksq\" (UID: \"06adcbe1-93b1-4edd-b1a5-536f0c54043e\") " pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.648259 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.654633 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qnq\" (UniqueName: \"kubernetes.io/projected/421cd57e-a15d-457a-a515-d071c7720f85-kube-api-access-z7qnq\") pod \"machine-config-controller-84d6567774-hxlrs\" (UID: \"421cd57e-a15d-457a-a515-d071c7720f85\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.657056 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.661116 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.667428 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97l7\" (UniqueName: \"kubernetes.io/projected/c5564eec-0f5f-407c-b34f-3d22c5f9921b-kube-api-access-d97l7\") pod \"catalog-operator-68c6474976-mg7r9\" (UID: \"c5564eec-0f5f-407c-b34f-3d22c5f9921b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.674392 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.678634 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.688326 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpsbt\" (UniqueName: \"kubernetes.io/projected/dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0-kube-api-access-wpsbt\") pod \"machine-config-server-kwcbv\" (UID: \"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0\") " pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.689836 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mx28l"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.691959 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" event={"ID":"2c4ec43d-4942-442e-8a64-78e724700938","Type":"ContainerStarted","Data":"8e9c6be66451b3b22b312ecbaee074259ce392b82129eb5667ea2b951e828f45"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.691985 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" event={"ID":"2c4ec43d-4942-442e-8a64-78e724700938","Type":"ContainerStarted","Data":"2999eeaf0585285e9dfb4cc616288083c3ff2ff3d393f2768afdb58ab55ab7f9"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.699810 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.708251 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kwcbv" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.713575 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" event={"ID":"bdabe348-b2e8-4c4c-a3d8-c5827a94e615","Type":"ContainerStarted","Data":"4e7af9f6df4f65e99f7e19b33de0ce9cfbc42ab6ade0c8a10bcd79324a197a97"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.714246 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.716296 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wgrx5" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.729737 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.729807 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.731603 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp"] Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.731857 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.231838109 +0000 UTC m=+147.670474870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.731960 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.731996 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz4sj\" (UniqueName: \"kubernetes.io/projected/3efefb06-dccd-4432-8a91-9ac951803c21-kube-api-access-jz4sj\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.732512 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.232496075 +0000 UTC m=+147.671132836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.740567 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz4sj\" (UniqueName: \"kubernetes.io/projected/3efefb06-dccd-4432-8a91-9ac951803c21-kube-api-access-jz4sj\") pod \"cluster-samples-operator-665b6dd947-648dg\" (UID: \"3efefb06-dccd-4432-8a91-9ac951803c21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.742047 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z5c8\" (UniqueName: \"kubernetes.io/projected/e8602f25-310b-4d02-af41-fe47753bfcfe-kube-api-access-2z5c8\") pod \"olm-operator-6b444d44fb-gwf56\" (UID: \"e8602f25-310b-4d02-af41-fe47753bfcfe\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.760119 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.763961 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qc7z\" (UniqueName: \"kubernetes.io/projected/f492bda5-ee80-44ea-9c78-42d5bfd959da-kube-api-access-9qc7z\") pod \"multus-admission-controller-857f4d67dd-5lrgb\" (UID: \"f492bda5-ee80-44ea-9c78-42d5bfd959da\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc 
kubenswrapper[4914]: I0130 21:16:53.775258 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" event={"ID":"a86e9a60-2314-425d-acae-d6611ca8b181","Type":"ContainerStarted","Data":"b5ee7dc8282bee3b670b39045e794abe5d917323fa3e0152de5e28a9f38fc24f"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.780922 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" event={"ID":"8b850795-7fca-417d-9e31-c319e45e2594","Type":"ContainerStarted","Data":"d4febac7b7cd986102d7fcdc11f8e5c3d551ff11b7793ef79d1be2c690161d85"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.791985 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" event={"ID":"56bea571-93fc-4c52-aeef-39c979dfd095","Type":"ContainerStarted","Data":"9377f83188ac263e2b23e8d0d5d4041f0f4bac031b2210ec341c76d2d37a798a"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.792028 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" event={"ID":"56bea571-93fc-4c52-aeef-39c979dfd095","Type":"ContainerStarted","Data":"6527c19ed2a2df2c1870b0419d48cff7d1789097f33c3fd42638abd08cf78cda"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.804350 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f65q2" event={"ID":"a961d0f9-f1b6-4a3b-8c49-f03f1b797632","Type":"ContainerStarted","Data":"0a8ff90c440be0fc10839b2d0ccb9f1882e14a5e6c0622fd1fc4f2a4d6dccc23"} Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.834760 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.835607 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.335592477 +0000 UTC m=+147.774229238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.838597 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.850700 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.853744 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.854821 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.871464 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.872128 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.885784 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l546h"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.885996 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.886011 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.911772 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt"] Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.920178 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.940011 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:53 crc kubenswrapper[4914]: E0130 21:16:53.963970 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.46395554 +0000 UTC m=+147.902592301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:53 crc kubenswrapper[4914]: I0130 21:16:53.966012 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2cd62"] Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.003136 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-md7dg"] Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.008994 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz"] Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.015692 4914 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5tbtc"] Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.041233 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.041397 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.541365761 +0000 UTC m=+147.980002522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.041526 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.041817 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.541805312 +0000 UTC m=+147.980442073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.059426 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt"] Jan 30 21:16:54 crc kubenswrapper[4914]: W0130 21:16:54.124196 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401dfb2e_119a_487d_915a_b2bfdb275f74.slice/crio-e7b0a4bdbcfbfa655bd9549e647d47938c93953a0f1a33c01757f663f9998be8 WatchSource:0}: Error finding container e7b0a4bdbcfbfa655bd9549e647d47938c93953a0f1a33c01757f663f9998be8: Status 404 returned error can't find the container with id e7b0a4bdbcfbfa655bd9549e647d47938c93953a0f1a33c01757f663f9998be8 Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.143771 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.144161 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.644146446 +0000 UTC m=+148.082783207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: W0130 21:16:54.183264 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddadb7c6b_fdea_413b_a5ef_7d7bc19a02c0.slice/crio-c3a43046316c7727c2ce480af0b4161792bc181d780648d3fe5c52cdebf36000 WatchSource:0}: Error finding container c3a43046316c7727c2ce480af0b4161792bc181d780648d3fe5c52cdebf36000: Status 404 returned error can't find the container with id c3a43046316c7727c2ce480af0b4161792bc181d780648d3fe5c52cdebf36000 Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.234584 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ws7lj" podStartSLOduration=126.234567802 podStartE2EDuration="2m6.234567802s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:54.233092856 +0000 UTC m=+147.671729617" watchObservedRunningTime="2026-01-30 21:16:54.234567802 +0000 UTC m=+147.673204563" Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.234842 4914 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" podStartSLOduration=126.234839228 podStartE2EDuration="2m6.234839228s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:54.201148404 +0000 UTC m=+147.639785165" watchObservedRunningTime="2026-01-30 21:16:54.234839228 +0000 UTC m=+147.673475989" Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.245080 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.245412 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.745402154 +0000 UTC m=+148.184038915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.284870 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw"] Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.364169 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.364244 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.864227276 +0000 UTC m=+148.302864037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.364599 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.364889 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.864880292 +0000 UTC m=+148.303517053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.380424 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wx2ts"] Jan 30 21:16:54 crc kubenswrapper[4914]: W0130 21:16:54.441243 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a20a3b7_3c40_4816_a5b2_4d756cfbd948.slice/crio-bab561db0bd25d98dc2583aa6ca4a8c6ba4ec4c4fb843dd1ecdbe828adb75dc6 WatchSource:0}: Error finding container bab561db0bd25d98dc2583aa6ca4a8c6ba4ec4c4fb843dd1ecdbe828adb75dc6: Status 404 returned error can't find the container with id bab561db0bd25d98dc2583aa6ca4a8c6ba4ec4c4fb843dd1ecdbe828adb75dc6 Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.461590 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-79tl2" podStartSLOduration=126.461526678 podStartE2EDuration="2m6.461526678s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:54.460727609 +0000 UTC m=+147.899364370" watchObservedRunningTime="2026-01-30 21:16:54.461526678 +0000 UTC m=+147.900163439" Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.465295 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.465665 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:54.965651378 +0000 UTC m=+148.404288139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.546058 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" podStartSLOduration=126.546044721 podStartE2EDuration="2m6.546044721s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:54.545808655 +0000 UTC m=+147.984445426" watchObservedRunningTime="2026-01-30 21:16:54.546044721 +0000 UTC m=+147.984681472" Jan 30 21:16:54 crc kubenswrapper[4914]: W0130 21:16:54.560204 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a2f6adb_e5cc_43f7_974d_11bae45ddbcc.slice/crio-846f0b8ae04475f141760d2b2987cebb8aa92656a57a89a57c7eacf1b8dd04b6 WatchSource:0}: Error finding container 
846f0b8ae04475f141760d2b2987cebb8aa92656a57a89a57c7eacf1b8dd04b6: Status 404 returned error can't find the container with id 846f0b8ae04475f141760d2b2987cebb8aa92656a57a89a57c7eacf1b8dd04b6 Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.566637 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.567030 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.067017998 +0000 UTC m=+148.505654759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.667231 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.667493 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.167479216 +0000 UTC m=+148.606115977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.768161 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.768524 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.268513009 +0000 UTC m=+148.707149770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.853922 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" event={"ID":"2a20a3b7-3c40-4816-a5b2-4d756cfbd948","Type":"ContainerStarted","Data":"bab561db0bd25d98dc2583aa6ca4a8c6ba4ec4c4fb843dd1ecdbe828adb75dc6"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.874431 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.875527 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.375511535 +0000 UTC m=+148.814148296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.877837 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" event={"ID":"feb3e51e-2635-4659-bdb6-c3e72ed63b41","Type":"ContainerStarted","Data":"005c11ac1d15f75745250068feb5a09ca2b30ab9a0417c7a744213db316996af"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.878665 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" event={"ID":"ca8b735a-2235-4ff8-920e-f40483600c05","Type":"ContainerStarted","Data":"1256bb05469cf8e2381f633f7366d9254c3eae399780a18307da41361897ed7e"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.879482 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-md7dg" event={"ID":"564bf8fe-2efd-4e47-bbf5-f0dea6402178","Type":"ContainerStarted","Data":"561d5201dcb3ce7c3a7345e69064cffc0bb568b96f72ad4502776965435a1e8e"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.891954 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" event={"ID":"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc","Type":"ContainerStarted","Data":"846f0b8ae04475f141760d2b2987cebb8aa92656a57a89a57c7eacf1b8dd04b6"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.922518 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" event={"ID":"54963e78-1698-4be2-925c-be7dc08c34a6","Type":"ContainerStarted","Data":"4754bc511131c7492673ba0827da0b86eb76cc312f662f2303ef3f3a7827f716"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.922550 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" event={"ID":"54963e78-1698-4be2-925c-be7dc08c34a6","Type":"ContainerStarted","Data":"3b5b31baf3eea3a0b6a396dc7d54740eb9f5ff9add4b7d6a299193db0e408001"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.934504 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-scclv" event={"ID":"77a21683-69d1-4459-aa95-cf4f0d33ec19","Type":"ContainerStarted","Data":"1a15303484b5e02914a3a24ffb174692567f7691ff5cc6f589dc0c95044b11fa"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.957617 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" podStartSLOduration=126.957598 podStartE2EDuration="2m6.957598s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:54.947144657 +0000 UTC m=+148.385781418" watchObservedRunningTime="2026-01-30 21:16:54.957598 +0000 UTC m=+148.396234761" Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.959773 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mx28l" event={"ID":"8a73fa67-f017-4a93-a8f5-6d2f753dcb37","Type":"ContainerStarted","Data":"dc887575c5c5d0b07c7f7ad2d1fdb40262d058085fc1e49c72d34df8e48b9731"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.960588 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" 
event={"ID":"408e8313-53b0-4848-9d70-c99eaa88d122","Type":"ContainerStarted","Data":"e56ff74fad308ada2346cbae905408f287b35ee6b13bc33814576503ebc0d989"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.961452 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" event={"ID":"d1e4319f-4808-4c3b-8dfb-4002f1bd7885","Type":"ContainerStarted","Data":"bea16f5a4d784ab643fd9593e5f6b52ca6d4f7148cf98ff564ae01fa7c5c5cf5"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.962259 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" event={"ID":"95726c08-64b5-4c14-9eed-81815ea8efcb","Type":"ContainerStarted","Data":"8556f6d2ef22948ae48991dcc64065e6b07c5c735d5c2ef27dcb33d6c6909e64"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.982093 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:54 crc kubenswrapper[4914]: E0130 21:16:54.982644 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.482630525 +0000 UTC m=+148.921267286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.989164 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" event={"ID":"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6","Type":"ContainerStarted","Data":"ad0da7dc2baf062a7483c8228d56105b91ec4016a9e28ca19fdeb35a2adf0e29"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.990666 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f65q2" event={"ID":"a961d0f9-f1b6-4a3b-8c49-f03f1b797632","Type":"ContainerStarted","Data":"1ea10a354b8308e2fb2e05324a761d2289b1b412142a9f37fdb9cbbdec41a075"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.997465 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" event={"ID":"2c4ec43d-4942-442e-8a64-78e724700938","Type":"ContainerStarted","Data":"c28cb78745739f99f60e7724c48ac7074f73ea401ca3d17cb8f42a645e6e1711"} Jan 30 21:16:54 crc kubenswrapper[4914]: I0130 21:16:54.999436 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kwcbv" event={"ID":"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0","Type":"ContainerStarted","Data":"c3a43046316c7727c2ce480af0b4161792bc181d780648d3fe5c52cdebf36000"} Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.000021 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" event={"ID":"c7e2d746-634f-4c1a-9d70-9a61db901650","Type":"ContainerStarted","Data":"f537ebc9876f23c390ee4390986353707b8d1ef750dd5f685fff591a47ce3992"} Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.000603 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" event={"ID":"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66","Type":"ContainerStarted","Data":"7fd27812a3be83870e8661b1959d503dafe4f0afbba0071f8c52204de7da8d82"} Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.001764 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" event={"ID":"e3882a6f-456e-4016-b6b1-76a916735c3b","Type":"ContainerStarted","Data":"435ef17f501995a1bc707e7aaa7ae6f064f9197c4105efe0d8e0de4f69170cf6"} Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.002409 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" event={"ID":"401dfb2e-119a-487d-915a-b2bfdb275f74","Type":"ContainerStarted","Data":"e7b0a4bdbcfbfa655bd9549e647d47938c93953a0f1a33c01757f663f9998be8"} Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.012654 4914 generic.go:334] "Generic (PLEG): container finished" podID="a86e9a60-2314-425d-acae-d6611ca8b181" containerID="08d766cf3db72ad1e5c84db9d3707a96d31675cb469fb33e62f8828a34a31700" exitCode=0 Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.015888 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" event={"ID":"a86e9a60-2314-425d-acae-d6611ca8b181","Type":"ContainerDied","Data":"08d766cf3db72ad1e5c84db9d3707a96d31675cb469fb33e62f8828a34a31700"} Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.047909 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-tspt2" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.083437 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.084266 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.584251291 +0000 UTC m=+149.022888052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.084887 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.085214 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.585195774 +0000 UTC m=+149.023832535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.195731 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.197396 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.697359854 +0000 UTC m=+149.135996615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.301130 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.302307 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.802295201 +0000 UTC m=+149.240931962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.329295 4914 csr.go:261] certificate signing request csr-h6kwq is approved, waiting to be issued Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.336944 4914 csr.go:257] certificate signing request csr-h6kwq is issued Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.378815 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" podStartSLOduration=127.3787979 podStartE2EDuration="2m7.3787979s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:55.377611642 +0000 UTC m=+148.816248403" watchObservedRunningTime="2026-01-30 21:16:55.3787979 +0000 UTC m=+148.817434651" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.405500 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.405835 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-01-30 21:16:55.905819973 +0000 UTC m=+149.344456734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.508608 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.509180 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.009167492 +0000 UTC m=+149.447804253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.509581 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wgrx5"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.537106 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.555175 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:16:55 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:16:55 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:16:55 crc kubenswrapper[4914]: healthz check failed Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.555227 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.568396 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l6777" podStartSLOduration=128.568377523 podStartE2EDuration="2m8.568377523s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:55.510905374 +0000 UTC m=+148.949542135" watchObservedRunningTime="2026-01-30 21:16:55.568377523 +0000 UTC m=+149.007014284" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.570343 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-snrcw"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.583139 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56"] Jan 30 21:16:55 crc kubenswrapper[4914]: W0130 21:16:55.597247 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8602f25_310b_4d02_af41_fe47753bfcfe.slice/crio-230c850b66a96c5ae1d2c331caeac9013e63713ec26b64727bafbc1d3ce27e43 WatchSource:0}: Error finding container 230c850b66a96c5ae1d2c331caeac9013e63713ec26b64727bafbc1d3ce27e43: Status 404 returned error can't find the container with id 230c850b66a96c5ae1d2c331caeac9013e63713ec26b64727bafbc1d3ce27e43 Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.608424 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.609811 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.610274 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:55 crc 
kubenswrapper[4914]: I0130 21:16:55.612770 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.612835 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.612973 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.613110 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.113089234 +0000 UTC m=+149.551725995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.617233 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.617561 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f65q2" podStartSLOduration=127.617545822 podStartE2EDuration="2m7.617545822s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:55.608294058 +0000 UTC m=+149.046930819" watchObservedRunningTime="2026-01-30 21:16:55.617545822 +0000 UTC m=+149.056182583" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.624354 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.626173 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.637215 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.642794 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5lrgb"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.644486 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.650604 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.651614 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.653759 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.658148 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd"] Jan 30 21:16:55 crc kubenswrapper[4914]: W0130 21:16:55.697902 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf492bda5_ee80_44ea_9c78_42d5bfd959da.slice/crio-f1fa294fcb4b66a6f1040c63c444afeec4236f7edadd2db5f2592c69adf8f9e4 WatchSource:0}: Error finding container f1fa294fcb4b66a6f1040c63c444afeec4236f7edadd2db5f2592c69adf8f9e4: Status 404 returned error can't find the container with id f1fa294fcb4b66a6f1040c63c444afeec4236f7edadd2db5f2592c69adf8f9e4 Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.714174 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.714257 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.714522 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 21:16:56.214511075 +0000 UTC m=+149.653147836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: W0130 21:16:55.722389 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5564eec_0f5f_407c_b34f_3d22c5f9921b.slice/crio-acb4b3719882fa15c67bae90971ef27a48e7773e80977021c0eaa091d0792788 WatchSource:0}: Error finding container acb4b3719882fa15c67bae90971ef27a48e7773e80977021c0eaa091d0792788: Status 404 returned error can't find the container with id acb4b3719882fa15c67bae90971ef27a48e7773e80977021c0eaa091d0792788 Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.723693 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.828107 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.828738 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.829747 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.329731561 +0000 UTC m=+149.768368322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.837875 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a911963-1d06-47d0-8f70-d81d5bd47496-metrics-certs\") pod \"network-metrics-daemon-c2klk\" (UID: \"8a911963-1d06-47d0-8f70-d81d5bd47496\") " pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.843341 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.854592 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.884079 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fj2g8"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.894393 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fhksq"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.908887 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.911134 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-85rbp"] Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.929875 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:55 crc kubenswrapper[4914]: E0130 21:16:55.930200 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.430188309 +0000 UTC m=+149.868825070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:55 crc kubenswrapper[4914]: I0130 21:16:55.947149 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c2klk" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.031779 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.032047 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.532034531 +0000 UTC m=+149.970671292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.046520 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" event={"ID":"95726c08-64b5-4c14-9eed-81815ea8efcb","Type":"ContainerStarted","Data":"7205366a2716dd306d447f13fbd2e5f8efbbf53a813700f8164498e4a52f4425"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.048882 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" event={"ID":"e050cbd0-653b-4d23-8a69-affa52be9608","Type":"ContainerStarted","Data":"29f7e9fa40626e2a6348c3b05bcdd84292e01e81ed5640d31d52315ca3a400c0"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.050556 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" event={"ID":"06adcbe1-93b1-4edd-b1a5-536f0c54043e","Type":"ContainerStarted","Data":"51c56b3e1df7cf37efb778394b55760b3b6be23cf8c393f661131c84f5c684e8"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.059182 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-md7dg" event={"ID":"564bf8fe-2efd-4e47-bbf5-f0dea6402178","Type":"ContainerStarted","Data":"2882efb703742435ce8e27b93104813bbce635cf0190fa11438a622eb4ccbe86"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.059396 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.065721 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" event={"ID":"5cb07f77-973c-4d69-b6f4-d250599cf3a3","Type":"ContainerStarted","Data":"ba4967c14646da7ce257699053812c961ada8187a9bad8ca410e45a0a9d40817"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.066107 4914 patch_prober.go:28] interesting pod/console-operator-58897d9998-md7dg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.066138 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-md7dg" podUID="564bf8fe-2efd-4e47-bbf5-f0dea6402178" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.066599 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-whh9t" podStartSLOduration=128.066582956 podStartE2EDuration="2m8.066582956s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.063929392 +0000 UTC m=+149.502566143" watchObservedRunningTime="2026-01-30 21:16:56.066582956 +0000 UTC m=+149.505219717" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.076434 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-scclv" 
event={"ID":"77a21683-69d1-4459-aa95-cf4f0d33ec19","Type":"ContainerStarted","Data":"671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.080163 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" event={"ID":"a86e9a60-2314-425d-acae-d6611ca8b181","Type":"ContainerStarted","Data":"041e463ea0ef4971e6ab33f5a1cac8a9c7072c8683444f9360fc5b34e5347671"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.086838 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" event={"ID":"ca8b735a-2235-4ff8-920e-f40483600c05","Type":"ContainerStarted","Data":"833768c081d18ffc56489f7c1abc617103649f15fbd924d955a073eda7e69518"} Jan 30 21:16:56 crc kubenswrapper[4914]: W0130 21:16:56.093655 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-118b901c03ac43f568f6bc5a1b346043acf6a8ba91acea162ffff46f8cfbf947 WatchSource:0}: Error finding container 118b901c03ac43f568f6bc5a1b346043acf6a8ba91acea162ffff46f8cfbf947: Status 404 returned error can't find the container with id 118b901c03ac43f568f6bc5a1b346043acf6a8ba91acea162ffff46f8cfbf947 Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.094147 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" event={"ID":"6b3718ea-66f6-4f01-97c5-94c7c844e1a0","Type":"ContainerStarted","Data":"aa5b9fec237ad26acfd9865043794697e37fad50caf924f8f9bb3431b0c6484e"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.096173 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-md7dg" podStartSLOduration=128.096026098 podStartE2EDuration="2m8.096026098s" 
podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.085410831 +0000 UTC m=+149.524047602" watchObservedRunningTime="2026-01-30 21:16:56.096026098 +0000 UTC m=+149.534662859" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.096679 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" event={"ID":"54963e78-1698-4be2-925c-be7dc08c34a6","Type":"ContainerStarted","Data":"7510400f352d858299efe1887e7a8eb6ebe105e78b6124df94671b2e33323243"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.101363 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-scclv" podStartSLOduration=128.101346146 podStartE2EDuration="2m8.101346146s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.099397759 +0000 UTC m=+149.538034520" watchObservedRunningTime="2026-01-30 21:16:56.101346146 +0000 UTC m=+149.539982907" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.105788 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kwcbv" event={"ID":"dadb7c6b-fdea-413b-a5ef-7d7bc19a02c0","Type":"ContainerStarted","Data":"72e2a309941d3a7f0d093d806c3ee1d8dd18ab3f9526c07df4e502d769dfb3d5"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.114495 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x7vxp" podStartSLOduration=128.114477144 podStartE2EDuration="2m8.114477144s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-30 21:16:56.114289539 +0000 UTC m=+149.552926300" watchObservedRunningTime="2026-01-30 21:16:56.114477144 +0000 UTC m=+149.553113895" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.117868 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" event={"ID":"dfc55944-5aa9-4e66-b049-d109415b0f5e","Type":"ContainerStarted","Data":"14ffa3b6708d224bc9c7fa11fd186571f8edfa8b69a1a4ed876e477f007dd806"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.119329 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mx28l" event={"ID":"8a73fa67-f017-4a93-a8f5-6d2f753dcb37","Type":"ContainerStarted","Data":"0983ed9df36ab808d0170d051b7d4379bcaa9a521f547bcc046275644efeeacf"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.120644 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mx28l" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.123466 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" event={"ID":"c5564eec-0f5f-407c-b34f-3d22c5f9921b","Type":"ContainerStarted","Data":"acb4b3719882fa15c67bae90971ef27a48e7773e80977021c0eaa091d0792788"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.125168 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" event={"ID":"427fbe21-4cfb-4e3f-868f-6b40ab37f9f6","Type":"ContainerStarted","Data":"58c2fca1827fecb6d1d246ca6496fa7c88e834abd8b56da5fd06b553eb08735d"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.126737 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" 
event={"ID":"401dfb2e-119a-487d-915a-b2bfdb275f74","Type":"ContainerStarted","Data":"a99404700bfe379980b77a50c18da1371c7969203eb1ba7d845c82fcaeb493da"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.128216 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.128252 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.134874 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4p4nt" podStartSLOduration=128.134855666 podStartE2EDuration="2m8.134855666s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.131444414 +0000 UTC m=+149.570081175" watchObservedRunningTime="2026-01-30 21:16:56.134855666 +0000 UTC m=+149.573492427" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.135296 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" event={"ID":"408e8313-53b0-4848-9d70-c99eaa88d122","Type":"ContainerStarted","Data":"8a1513ab62143c05f4b10821043dd1e4d10c99d26b7a3887751198d5176af8f6"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.136316 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.136636 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.636623639 +0000 UTC m=+150.075260400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.138257 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" event={"ID":"2a20a3b7-3c40-4816-a5b2-4d756cfbd948","Type":"ContainerStarted","Data":"8f1befc2dc60279a12641741cf710502cefc4da32dfb8fe63e2c4209fb152ff2"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.139020 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.140424 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" 
event={"ID":"f492bda5-ee80-44ea-9c78-42d5bfd959da","Type":"ContainerStarted","Data":"f1fa294fcb4b66a6f1040c63c444afeec4236f7edadd2db5f2592c69adf8f9e4"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.141312 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fj2g8" event={"ID":"ff6855e6-7cae-432d-a7bf-6d3879ca88c3","Type":"ContainerStarted","Data":"1fc1420184cc7db4f26beaaf9b08b0725f99741429d27bb85ddec18a74e9c7b2"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.142235 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" event={"ID":"421cd57e-a15d-457a-a515-d071c7720f85","Type":"ContainerStarted","Data":"e1a424e45662846fad8c4565f24aa2c619931676b4b4350320c0986efc4ef476"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.143820 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" event={"ID":"3846973a-e8a4-432a-b800-99b21bc0a93a","Type":"ContainerStarted","Data":"f66ccd8ff50b87e1980a4af2b94ce8c2e16845de50167abffa591546aaf08b62"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.148926 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" event={"ID":"d1e4319f-4808-4c3b-8dfb-4002f1bd7885","Type":"ContainerStarted","Data":"7e5b8dc2686f1b85c3fb4133b5cec364887e04493c1f0b60ed1f1af91e02a3b8"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.163986 4914 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-w4dnw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.164048 4914 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" podUID="2a20a3b7-3c40-4816-a5b2-4d756cfbd948" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.173896 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" event={"ID":"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66","Type":"ContainerStarted","Data":"b013da72ddeeefda98b6c9fd3fa93acb309c15ad3544ac6f13596f65b59a41b6"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.174420 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.177851 4914 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2cd62 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.177912 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.179952 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" event={"ID":"e3882a6f-456e-4016-b6b1-76a916735c3b","Type":"ContainerStarted","Data":"5f91b6d5ef9f39bcf26418154db60e39e26f1b8218e043ac738b26f277afaf5a"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.192496 4914 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kwcbv" podStartSLOduration=6.192474549 podStartE2EDuration="6.192474549s" podCreationTimestamp="2026-01-30 21:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.182056077 +0000 UTC m=+149.620692838" watchObservedRunningTime="2026-01-30 21:16:56.192474549 +0000 UTC m=+149.631111310" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.192602 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mx28l" podStartSLOduration=128.192597892 podStartE2EDuration="2m8.192597892s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.162059354 +0000 UTC m=+149.600696115" watchObservedRunningTime="2026-01-30 21:16:56.192597892 +0000 UTC m=+149.631234643" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.198091 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" event={"ID":"a02a92b6-edd8-4ae5-871b-cea79ac68d5a","Type":"ContainerStarted","Data":"46a88cf9ba5f5be89f5299b9967fe45ef69873ccb0251ebba1f728b730d4277f"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.199976 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wgrx5" event={"ID":"489f08b5-b5d7-45a2-98e7-0c26139ed1d9","Type":"ContainerStarted","Data":"6fa5807f8be000ae19d8fb2ead6b4d932b40e8e9db0856abef495e0b5f15ca24"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.202052 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-59dcc" podStartSLOduration=128.202042951 
podStartE2EDuration="2m8.202042951s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.198400243 +0000 UTC m=+149.637037004" watchObservedRunningTime="2026-01-30 21:16:56.202042951 +0000 UTC m=+149.640679712" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.212354 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" event={"ID":"e8602f25-310b-4d02-af41-fe47753bfcfe","Type":"ContainerStarted","Data":"230c850b66a96c5ae1d2c331caeac9013e63713ec26b64727bafbc1d3ce27e43"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.213759 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" event={"ID":"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc","Type":"ContainerStarted","Data":"db027857765b1fc2bdd884c7974fe92dd1fcbc6a49a614c10365565b650719c5"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.216904 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.218394 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nblc8" podStartSLOduration=128.218384476 podStartE2EDuration="2m8.218384476s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.217159816 +0000 UTC m=+149.655796567" watchObservedRunningTime="2026-01-30 21:16:56.218384476 +0000 UTC m=+149.657021237" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.220950 4914 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wx2ts 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.221043 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.237059 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.238568 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.738548703 +0000 UTC m=+150.177185454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.273500 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-p49xt" podStartSLOduration=128.273485448 podStartE2EDuration="2m8.273485448s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.263152888 +0000 UTC m=+149.701789649" watchObservedRunningTime="2026-01-30 21:16:56.273485448 +0000 UTC m=+149.712122209" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.287748 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" event={"ID":"feb3e51e-2635-4659-bdb6-c3e72ed63b41","Type":"ContainerStarted","Data":"80f9b2574a13de07800855bfbcb9aa6b58f389273dee5e7809300e18af2e1550"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.298805 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" event={"ID":"3f94df58-70b2-4856-8878-b0fc196d6f6d","Type":"ContainerStarted","Data":"7db186b42e382f9dd7ef796b343a416807b1367de4bb54ae0096504311739ad8"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.305688 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" podStartSLOduration=128.305674826 
podStartE2EDuration="2m8.305674826s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.304924568 +0000 UTC m=+149.743561329" watchObservedRunningTime="2026-01-30 21:16:56.305674826 +0000 UTC m=+149.744311587" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.312213 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" event={"ID":"c7e2d746-634f-4c1a-9d70-9a61db901650","Type":"ContainerStarted","Data":"cc5ff46aaa8d0028c5cdfc02fd6a2c7e39c61a058ddbe7ca92b534c4c1e2371a"} Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.342927 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.343587 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" podStartSLOduration=128.343565072 podStartE2EDuration="2m8.343565072s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.328075457 +0000 UTC m=+149.766712218" watchObservedRunningTime="2026-01-30 21:16:56.343565072 +0000 UTC m=+149.782201833" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.343885 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-30 21:11:55 +0000 UTC, rotation deadline is 2026-11-29 00:39:03.534238355 +0000 UTC Jan 
30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.343963 4914 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7251h22m7.190277884s for next certificate rotation Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.344144 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.844127655 +0000 UTC m=+150.282764416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.346535 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" podStartSLOduration=128.346523933 podStartE2EDuration="2m8.346523933s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.342094046 +0000 UTC m=+149.780730807" watchObservedRunningTime="2026-01-30 21:16:56.346523933 +0000 UTC m=+149.785160694" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.369530 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-l546h" podStartSLOduration=128.369516059 podStartE2EDuration="2m8.369516059s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.369037277 +0000 UTC m=+149.807674038" watchObservedRunningTime="2026-01-30 21:16:56.369516059 +0000 UTC m=+149.808152820" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.385723 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j876b" podStartSLOduration=128.38569298 podStartE2EDuration="2m8.38569298s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:56.384132412 +0000 UTC m=+149.822769173" watchObservedRunningTime="2026-01-30 21:16:56.38569298 +0000 UTC m=+149.824329741" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.445756 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.445996 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.945940236 +0000 UTC m=+150.384577117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.446359 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.447275 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.947264118 +0000 UTC m=+150.385900869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: W0130 21:16:56.477059 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-2369aef38c872b618565e256f2bfc760377e9a52b5bd390bce4f124ddb32b505 WatchSource:0}: Error finding container 2369aef38c872b618565e256f2bfc760377e9a52b5bd390bce4f124ddb32b505: Status 404 returned error can't find the container with id 2369aef38c872b618565e256f2bfc760377e9a52b5bd390bce4f124ddb32b505 Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.504245 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.504284 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.510698 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.535270 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:16:56 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:16:56 crc 
kubenswrapper[4914]: [+]process-running ok Jan 30 21:16:56 crc kubenswrapper[4914]: healthz check failed Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.535315 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.548537 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.549015 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.048996528 +0000 UTC m=+150.487633289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.650154 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.650434 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.150419379 +0000 UTC m=+150.589056140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.753301 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.753450 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.253424819 +0000 UTC m=+150.692061580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.754018 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.754377 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.254362342 +0000 UTC m=+150.692999113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.786323 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c2klk"] Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.854803 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.855076 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.355055266 +0000 UTC m=+150.793692037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.955660 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:56 crc kubenswrapper[4914]: E0130 21:16:56.955984 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.455970075 +0000 UTC m=+150.894606836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.983288 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:16:56 crc kubenswrapper[4914]: I0130 21:16:56.983331 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.056195 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.056385 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.556355422 +0000 UTC m=+150.994992253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.157179 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.157610 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.657586309 +0000 UTC m=+151.096223090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.258320 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.258839 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.758823596 +0000 UTC m=+151.197460357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.333855 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c2klk" event={"ID":"8a911963-1d06-47d0-8f70-d81d5bd47496","Type":"ContainerStarted","Data":"4d4925136cc1cc81da676b85aaca28e4f353422cfd4c43dc6799778d1174d07e"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.333954 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c2klk" event={"ID":"8a911963-1d06-47d0-8f70-d81d5bd47496","Type":"ContainerStarted","Data":"8fe780d2bf5f27682d1ad80f1c613957e29aa572de74308ada5b16cc185ed502"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.354870 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" event={"ID":"f492bda5-ee80-44ea-9c78-42d5bfd959da","Type":"ContainerStarted","Data":"69a3d0c2e64c34d8d63ca35cf85425aac75bc777f7277a7d4ef60e719a3e305d"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.361064 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.362688 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.862636486 +0000 UTC m=+151.301273247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.364026 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" event={"ID":"5cb07f77-973c-4d69-b6f4-d250599cf3a3","Type":"ContainerStarted","Data":"f9870fa7e80e2b5bf53b689b2ad49223c37821f1bc1abb8a087f4a80964e5df3"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.377497 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" event={"ID":"3efefb06-dccd-4432-8a91-9ac951803c21","Type":"ContainerStarted","Data":"3fe96248ada932a8a85fc7aa4ff4707a02f5bbdc50d86c59813aa724c5860aaa"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.405991 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" event={"ID":"421cd57e-a15d-457a-a515-d071c7720f85","Type":"ContainerStarted","Data":"f58c9323195f415244cebd6670a02a4a45c3199fabfb205765d36c2216100741"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.411069 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" 
event={"ID":"a86e9a60-2314-425d-acae-d6611ca8b181","Type":"ContainerStarted","Data":"f80ea125287a530555f17eb192a99d895f0ac430dac9214fc7400791318cacec"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.429281 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"20569d371c4fc758675ecc34fda610002a79b909f15efbc27cf181d930091cfe"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.429323 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2369aef38c872b618565e256f2bfc760377e9a52b5bd390bce4f124ddb32b505"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.430037 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.435363 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" event={"ID":"a02a92b6-edd8-4ae5-871b-cea79ac68d5a","Type":"ContainerStarted","Data":"84c0f1355c4bb384d55a86655ec697a5475428d0db513dec86f2224029fc531a"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.442682 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ww6zz" podStartSLOduration=129.44266845 podStartE2EDuration="2m9.44266845s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.389544076 +0000 UTC m=+150.828180837" watchObservedRunningTime="2026-01-30 21:16:57.44266845 +0000 UTC m=+150.881305211" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.444205 
4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" podStartSLOduration=129.444200907 podStartE2EDuration="2m9.444200907s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.442065766 +0000 UTC m=+150.880702527" watchObservedRunningTime="2026-01-30 21:16:57.444200907 +0000 UTC m=+150.882837668" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.451436 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wgrx5" event={"ID":"489f08b5-b5d7-45a2-98e7-0c26139ed1d9","Type":"ContainerStarted","Data":"ad4398f4cc279684497ba84bc0ffff40c3a2cfde061174c5e0fbe70d97b903ff"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.460246 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" event={"ID":"c5564eec-0f5f-407c-b34f-3d22c5f9921b","Type":"ContainerStarted","Data":"e0db32ec3471725205c5efddce7d81db2ccbbb795a6adb2964cdede8ea83f29b"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.461961 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.463825 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.463912 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.963895494 +0000 UTC m=+151.402532255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.464149 4914 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mg7r9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.464161 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.464179 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" podUID="c5564eec-0f5f-407c-b34f-3d22c5f9921b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 
21:16:57.467427 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fj2g8" event={"ID":"ff6855e6-7cae-432d-a7bf-6d3879ca88c3","Type":"ContainerStarted","Data":"53798cd819718909130ca1709461745bbd9bedcc1ef0093e0fadd743a89adc6b"} Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.471276 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:57.971261112 +0000 UTC m=+151.409897873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.471636 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" event={"ID":"d1e4319f-4808-4c3b-8dfb-4002f1bd7885","Type":"ContainerStarted","Data":"0d528ad6bf51c92d53e64c961502e86ecf9540a747b972b56a2abcfdd9f5cdd5"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.476042 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e9ddec20f9472365e4560474ceb0ee74a001e084ef1ccad4dbe53482e24ea932"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.476080 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"118b901c03ac43f568f6bc5a1b346043acf6a8ba91acea162ffff46f8cfbf947"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.492331 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-snrcw" podStartSLOduration=129.49231354 podStartE2EDuration="2m9.49231354s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.486826818 +0000 UTC m=+150.925463579" watchObservedRunningTime="2026-01-30 21:16:57.49231354 +0000 UTC m=+150.930950301" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.510420 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"24c823ed1b921470f110faf076510c1c1e1b94f604bc9eab072912f818caab1a"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.510472 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f527cef1db2fab7efb430b6387752c2436b919532b4f6b6ff3c918d61f3440d5"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.521982 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" event={"ID":"e8602f25-310b-4d02-af41-fe47753bfcfe","Type":"ContainerStarted","Data":"ab920dfcc4cc60ade59d2707e43853b63e64e88116c0bbb958fe5e1b54a8c0dc"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.522896 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:57 crc 
kubenswrapper[4914]: I0130 21:16:57.525921 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" event={"ID":"3846973a-e8a4-432a-b800-99b21bc0a93a","Type":"ContainerStarted","Data":"77fa9a7333a99550df5570a687149c7d745381df3f222445a030f56d3104a82d"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.526896 4914 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-gwf56 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.526974 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" podUID="e8602f25-310b-4d02-af41-fe47753bfcfe" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.535182 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:16:57 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:16:57 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:16:57 crc kubenswrapper[4914]: healthz check failed Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.535220 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.536090 4914 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5tbtc" podStartSLOduration=129.536053978 podStartE2EDuration="2m9.536053978s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.505748815 +0000 UTC m=+150.944385586" watchObservedRunningTime="2026-01-30 21:16:57.536053978 +0000 UTC m=+150.974690739" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.537756 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wgrx5" podStartSLOduration=7.537748189 podStartE2EDuration="7.537748189s" podCreationTimestamp="2026-01-30 21:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.529583601 +0000 UTC m=+150.968220362" watchObservedRunningTime="2026-01-30 21:16:57.537748189 +0000 UTC m=+150.976384950" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.549680 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" event={"ID":"401dfb2e-119a-487d-915a-b2bfdb275f74","Type":"ContainerStarted","Data":"908ab95905927ecfa8e91e28a2b8e687e6c77910bbc6195bb642c2280460db46"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.568170 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" event={"ID":"6b3718ea-66f6-4f01-97c5-94c7c844e1a0","Type":"ContainerStarted","Data":"eda6feda1c2f1634d7a3588a5364dcad3cd3c7476de0ff0ba6dfb7448149358b"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.572138 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.572968 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" podStartSLOduration=129.57294449 podStartE2EDuration="2m9.57294449s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.56842007 +0000 UTC m=+151.007056831" watchObservedRunningTime="2026-01-30 21:16:57.57294449 +0000 UTC m=+151.011581251" Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.573853 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.073834611 +0000 UTC m=+151.512471372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.590275 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" event={"ID":"e050cbd0-653b-4d23-8a69-affa52be9608","Type":"ContainerStarted","Data":"1c5e4d37c10ce7fb0002fa4c617f9dd7f53d4300c4be72db9aa287a9f0ecb40d"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.602969 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" podStartSLOduration=129.602953415 podStartE2EDuration="2m9.602953415s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.602636107 +0000 UTC m=+151.041272868" watchObservedRunningTime="2026-01-30 21:16:57.602953415 +0000 UTC m=+151.041590176" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.626834 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" event={"ID":"3f94df58-70b2-4856-8878-b0fc196d6f6d","Type":"ContainerStarted","Data":"f5417d7060013553635c09108035f71bc019cd11502af0efcf6d3927dd132be1"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.648354 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5stwz" podStartSLOduration=129.648328772 
podStartE2EDuration="2m9.648328772s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.642270805 +0000 UTC m=+151.080907566" watchObservedRunningTime="2026-01-30 21:16:57.648328772 +0000 UTC m=+151.086965533" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.686072 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" event={"ID":"dfc55944-5aa9-4e66-b049-d109415b0f5e","Type":"ContainerStarted","Data":"d7836f2265617df1778980259fdc2660ec1c268dd03cdcd489c951a9c4f2c00e"} Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.689198 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.689236 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.691371 4914 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-w4dnw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.691412 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" 
podUID="2a20a3b7-3c40-4816-a5b2-4d756cfbd948" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.691539 4914 patch_prober.go:28] interesting pod/console-operator-58897d9998-md7dg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.691606 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-md7dg" podUID="564bf8fe-2efd-4e47-bbf5-f0dea6402178" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.692278 4914 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wx2ts container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.692300 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.692355 4914 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2cd62 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 
10.217.0.24:6443: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.692367 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.692927 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.693661 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.193648447 +0000 UTC m=+151.632285208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.698872 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-h9fmd" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.717343 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cjmd9" podStartSLOduration=129.71732663 podStartE2EDuration="2m9.71732663s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.717032623 +0000 UTC m=+151.155669594" watchObservedRunningTime="2026-01-30 21:16:57.71732663 +0000 UTC m=+151.155963411" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.718163 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" podStartSLOduration=117.71815711 podStartE2EDuration="1m57.71815711s" podCreationTimestamp="2026-01-30 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:57.687790876 +0000 UTC m=+151.126427627" watchObservedRunningTime="2026-01-30 21:16:57.71815711 +0000 UTC m=+151.156793871" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.801463 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.803236 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.303221876 +0000 UTC m=+151.741858637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.814345 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.814588 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.845485 4914 patch_prober.go:28] interesting pod/apiserver-76f77b778f-tfjll container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.845526 4914 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-apiserver/apiserver-76f77b778f-tfjll" podUID="a86e9a60-2314-425d-acae-d6611ca8b181" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 30 21:16:57 crc kubenswrapper[4914]: I0130 21:16:57.907303 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:57 crc kubenswrapper[4914]: E0130 21:16:57.908439 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.408418749 +0000 UTC m=+151.847055520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.013968 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.014360 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.514339079 +0000 UTC m=+151.952975840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.115093 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.115724 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.61570332 +0000 UTC m=+152.054340081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.216626 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.216815 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.716787123 +0000 UTC m=+152.155423884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.216882 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.217161 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.717154002 +0000 UTC m=+152.155790763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.323291 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.323471 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.823452482 +0000 UTC m=+152.262089243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.323538 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.323912 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.823900872 +0000 UTC m=+152.262537643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.424876 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.425102 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.925068138 +0000 UTC m=+152.363704899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.425356 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.425772 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:58.925756565 +0000 UTC m=+152.364393326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.526481 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.526639 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.026618093 +0000 UTC m=+152.465254854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.526777 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.527130 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.027121775 +0000 UTC m=+152.465758526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.535336 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:16:58 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:16:58 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:16:58 crc kubenswrapper[4914]: healthz check failed Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.535401 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.628052 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.628232 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:16:59.128205058 +0000 UTC m=+152.566841819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.628284 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.628573 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.128562097 +0000 UTC m=+152.567198858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.691536 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" event={"ID":"6b3718ea-66f6-4f01-97c5-94c7c844e1a0","Type":"ContainerStarted","Data":"e5aac065b7f29141bfec56e142a805aac73d001466fbce5b3bf9e2112a3c3d45"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.693522 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" event={"ID":"f492bda5-ee80-44ea-9c78-42d5bfd959da","Type":"ContainerStarted","Data":"627b2ee9679dd5078bb1af350a3310bdf97cee2bafe476959afe356c455d7594"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.694950 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fj2g8" event={"ID":"ff6855e6-7cae-432d-a7bf-6d3879ca88c3","Type":"ContainerStarted","Data":"ee473cef27b4beff56ecc7b3c6ec7c5eb7368ade14858d8490e5000d960c17f0"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.695000 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fj2g8" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.696290 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" event={"ID":"3efefb06-dccd-4432-8a91-9ac951803c21","Type":"ContainerStarted","Data":"eb4a80266fdf132f5594cb0b4456426521ec52c59634d4442525d70feeb34041"} Jan 30 21:16:58 crc 
kubenswrapper[4914]: I0130 21:16:58.696323 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" event={"ID":"3efefb06-dccd-4432-8a91-9ac951803c21","Type":"ContainerStarted","Data":"82c841193d4799ad78733597c72157654f8d6976c455d9a81f0784b848626a5b"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.697783 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" event={"ID":"421cd57e-a15d-457a-a515-d071c7720f85","Type":"ContainerStarted","Data":"756ef979b2aeb8fddc875d25f3610a993d0469af196a0678282d52e5a8da7900"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.703950 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" event={"ID":"06adcbe1-93b1-4edd-b1a5-536f0c54043e","Type":"ContainerStarted","Data":"db6c79dfaa978e5152736c1ac57bf7f273c3d425ee21c629b4ad9244f999abd9"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.705698 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c2klk" event={"ID":"8a911963-1d06-47d0-8f70-d81d5bd47496","Type":"ContainerStarted","Data":"ae1761a309beed2afc123e95d83d66c51bead7737e0c0e6cab63019d089eaec5"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.707968 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" event={"ID":"dfc55944-5aa9-4e66-b049-d109415b0f5e","Type":"ContainerStarted","Data":"17f36e6705237a802586835a69004aeb8adb300b6c7906f37bc2e79296182985"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.710804 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" 
event={"ID":"3846973a-e8a4-432a-b800-99b21bc0a93a","Type":"ContainerStarted","Data":"fa4bc78e615344f43095b83c466b705a1074c9e54428f357b92c8b70b8eb4afa"} Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.711361 4914 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wx2ts container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.711398 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.711941 4914 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mg7r9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.711983 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" podUID="c5564eec-0f5f-407c-b34f-3d22c5f9921b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.711876 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" 
start-of-body= Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.712258 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.716612 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gwf56" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.729172 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-85rbp" podStartSLOduration=130.729159379 podStartE2EDuration="2m10.729159379s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.727041728 +0000 UTC m=+152.165678489" watchObservedRunningTime="2026-01-30 21:16:58.729159379 +0000 UTC m=+152.167796140" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.729387 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.729541 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.229519127 +0000 UTC m=+152.668155888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.729572 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.729916 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.229905467 +0000 UTC m=+152.668542228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.756011 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-r9zkn" podStartSLOduration=130.755996397 podStartE2EDuration="2m10.755996397s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.754810768 +0000 UTC m=+152.193447529" watchObservedRunningTime="2026-01-30 21:16:58.755996397 +0000 UTC m=+152.194633148" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.772648 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-648dg" podStartSLOduration=130.772632099 podStartE2EDuration="2m10.772632099s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.770758593 +0000 UTC m=+152.209395364" watchObservedRunningTime="2026-01-30 21:16:58.772632099 +0000 UTC m=+152.211268860" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.810586 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" podStartSLOduration=130.810573546 podStartE2EDuration="2m10.810573546s" podCreationTimestamp="2026-01-30 
21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.807566513 +0000 UTC m=+152.246203274" watchObservedRunningTime="2026-01-30 21:16:58.810573546 +0000 UTC m=+152.249210307" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.830167 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.830435 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.330407615 +0000 UTC m=+152.769044376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.830983 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.831423 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.331406749 +0000 UTC m=+152.770043510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.847871 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hxlrs" podStartSLOduration=130.847856017 podStartE2EDuration="2m10.847856017s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.846982716 +0000 UTC m=+152.285619477" watchObservedRunningTime="2026-01-30 21:16:58.847856017 +0000 UTC m=+152.286492778" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.898160 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c2klk" podStartSLOduration=131.898140273 podStartE2EDuration="2m11.898140273s" podCreationTimestamp="2026-01-30 21:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.874522562 +0000 UTC m=+152.313159323" watchObservedRunningTime="2026-01-30 21:16:58.898140273 +0000 UTC m=+152.336777024" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.933152 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fj2g8" podStartSLOduration=8.933133668 podStartE2EDuration="8.933133668s" podCreationTimestamp="2026-01-30 21:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.896599865 +0000 UTC m=+152.335236646" watchObservedRunningTime="2026-01-30 21:16:58.933133668 +0000 UTC m=+152.371770429" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.933585 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5lrgb" podStartSLOduration=130.933581169 podStartE2EDuration="2m10.933581169s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:16:58.931101579 +0000 UTC m=+152.369738340" watchObservedRunningTime="2026-01-30 21:16:58.933581169 +0000 UTC m=+152.372217930" Jan 30 21:16:58 crc kubenswrapper[4914]: I0130 21:16:58.934357 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:58 crc kubenswrapper[4914]: E0130 21:16:58.934608 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.434597124 +0000 UTC m=+152.873233875 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.035672 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.035964 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.535953914 +0000 UTC m=+152.974590665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.136774 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.137045 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.637031217 +0000 UTC m=+153.075667978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.238221 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.238548 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.738536491 +0000 UTC m=+153.177173252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.339423 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.339601 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.839577774 +0000 UTC m=+153.278214525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.339650 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.339953 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.839941062 +0000 UTC m=+153.278577823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.441161 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.441325 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.941301322 +0000 UTC m=+153.379938083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.441411 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.441728 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:16:59.941720143 +0000 UTC m=+153.380356904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.465946 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-w4dnw" Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.534358 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:16:59 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:16:59 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:16:59 crc kubenswrapper[4914]: healthz check failed Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.534753 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.542925 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.543104 4914 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.043078403 +0000 UTC m=+153.481715164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.543148 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.543623 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.043617046 +0000 UTC m=+153.482253807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.644269 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.644399 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.144383062 +0000 UTC m=+153.583019823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.644429 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.644717 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.144698299 +0000 UTC m=+153.583335060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.716435 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.728719 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7r9" Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.745972 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.746154 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.246118751 +0000 UTC m=+153.684755512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.746380 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.746653 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.246646324 +0000 UTC m=+153.685283085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.847199 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.847380 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.347354818 +0000 UTC m=+153.785991579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.848056 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.851775 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.351763195 +0000 UTC m=+153.790399956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.949381 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.949733 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.449713682 +0000 UTC m=+153.888350443 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:59 crc kubenswrapper[4914]: I0130 21:16:59.950055 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:16:59 crc kubenswrapper[4914]: E0130 21:16:59.950342 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.450333117 +0000 UTC m=+153.888969878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.051946 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.052273 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.552256121 +0000 UTC m=+153.990892882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.052639 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.052951 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.552943408 +0000 UTC m=+153.991580169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.154097 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.154390 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.65437458 +0000 UTC m=+154.093011341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.154769 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.155067 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.655059046 +0000 UTC m=+154.093695807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.256425 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.256601 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.75657529 +0000 UTC m=+154.195212051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.256729 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.257025 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.757013051 +0000 UTC m=+154.195649812 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.357910 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.358087 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.858063454 +0000 UTC m=+154.296700215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.358201 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.358455 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.858444083 +0000 UTC m=+154.297080844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.458929 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.459233 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:00.959205668 +0000 UTC m=+154.397842429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.533974 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:17:00 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:17:00 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:17:00 crc kubenswrapper[4914]: healthz check failed Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.534463 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.560230 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.560534 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 21:17:01.060519097 +0000 UTC m=+154.499155858 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.661802 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.662129 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:01.162115313 +0000 UTC m=+154.600752074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.723798 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" event={"ID":"06adcbe1-93b1-4edd-b1a5-536f0c54043e","Type":"ContainerStarted","Data":"26e19a4f9b1ed5fa6781e6bd918b36fc864364b1154943e4043abba5bf61abc6"} Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.724084 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" event={"ID":"06adcbe1-93b1-4edd-b1a5-536f0c54043e","Type":"ContainerStarted","Data":"fcc5129a7c8c28c049ff859178959b26ef8d7c8ea6db147ed5d216b07c8a1d37"} Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.763318 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.763768 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:01.26374415 +0000 UTC m=+154.702380921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.796786 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-42klg"] Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.797895 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.800831 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.815342 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42klg"] Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.864081 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.864244 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:01.364216019 +0000 UTC m=+154.802852780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.864395 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.864805 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:01.364789313 +0000 UTC m=+154.803426074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.865576 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdr8\" (UniqueName: \"kubernetes.io/projected/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-kube-api-access-xkdr8\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.866126 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-catalog-content\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.867554 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-utilities\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.933093 4914 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 
21:17:00.968392 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.968523 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:17:01.46850701 +0000 UTC m=+154.907143771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.968568 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-catalog-content\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.968606 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-utilities\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc 
kubenswrapper[4914]: I0130 21:17:00.968647 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.968688 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdr8\" (UniqueName: \"kubernetes.io/projected/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-kube-api-access-xkdr8\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.968974 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-catalog-content\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.969076 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-utilities\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:00 crc kubenswrapper[4914]: E0130 21:17:00.969195 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:17:01.469187456 +0000 UTC m=+154.907824217 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6tfzr" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:00 crc kubenswrapper[4914]: I0130 21:17:00.992105 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z85fs"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.001047 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.005174 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.020599 4914 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-30T21:17:00.933120974Z","Handler":null,"Name":""} Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.029683 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdr8\" (UniqueName: \"kubernetes.io/projected/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-kube-api-access-xkdr8\") pod \"certified-operators-42klg\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") " pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.033235 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z85fs"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.037206 4914 csi_plugin.go:100] kubernetes.io/csi: Trying 
to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.037413 4914 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.069776 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.070077 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w5t2\" (UniqueName: \"kubernetes.io/projected/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-kube-api-access-5w5t2\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.070154 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-utilities\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.070205 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-catalog-content\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " 
pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.073937 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.120045 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42klg" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.171913 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.172273 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-utilities\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.172325 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-catalog-content\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 
21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.172374 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w5t2\" (UniqueName: \"kubernetes.io/projected/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-kube-api-access-5w5t2\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.173166 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-catalog-content\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.175731 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-utilities\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.178346 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.179313 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.183933 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.184206 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.195134 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.199830 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mchvp"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.203273 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.214134 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w5t2\" (UniqueName: \"kubernetes.io/projected/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-kube-api-access-5w5t2\") pod \"community-operators-z85fs\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.217483 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mchvp"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.273859 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ncf4\" (UniqueName: \"kubernetes.io/projected/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-kube-api-access-2ncf4\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " 
pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.273921 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efdf060-a1d1-490b-9aa5-29084e680131-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.274001 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-utilities\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.274023 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efdf060-a1d1-490b-9aa5-29084e680131-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.274138 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-catalog-content\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.316198 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.316546 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.354997 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42klg"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.378443 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.379034 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-utilities\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.379184 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efdf060-a1d1-490b-9aa5-29084e680131-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.379354 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-catalog-content\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.379471 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ncf4\" (UniqueName: \"kubernetes.io/projected/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-kube-api-access-2ncf4\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.379561 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efdf060-a1d1-490b-9aa5-29084e680131-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.379737 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efdf060-a1d1-490b-9aa5-29084e680131-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.380329 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-utilities\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.380589 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-catalog-content\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.389314 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v6j7j"] Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.390238 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.397507 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6tfzr\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") " pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.404417 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ncf4\" (UniqueName: \"kubernetes.io/projected/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-kube-api-access-2ncf4\") pod \"certified-operators-mchvp\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") " pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.407307 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efdf060-a1d1-490b-9aa5-29084e680131-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.410382 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6j7j"] Jan 
30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.480416 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-catalog-content\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.480455 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbx6\" (UniqueName: \"kubernetes.io/projected/928780fe-51a3-4e38-b573-31145d0a720c-kube-api-access-xbbx6\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.480496 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-utilities\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.504999 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.534912 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:17:01 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:17:01 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:17:01 crc kubenswrapper[4914]: healthz check failed Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.534972 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.543730 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.546733 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mchvp" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.581768 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-utilities\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.582038 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-catalog-content\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.582060 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbx6\" (UniqueName: \"kubernetes.io/projected/928780fe-51a3-4e38-b573-31145d0a720c-kube-api-access-xbbx6\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.582741 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-catalog-content\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.582793 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-utilities\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " 
pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.604531 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbx6\" (UniqueName: \"kubernetes.io/projected/928780fe-51a3-4e38-b573-31145d0a720c-kube-api-access-xbbx6\") pod \"community-operators-v6j7j\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") " pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.618434 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z85fs"] Jan 30 21:17:01 crc kubenswrapper[4914]: W0130 21:17:01.658984 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e9ae93a_9017_4fbf_aac3_1a1bb8081f6b.slice/crio-b6a78715b11f3aa7fe1d9ae8702fae502b3c24ec7804ce39f4d9d9a1ddad584f WatchSource:0}: Error finding container b6a78715b11f3aa7fe1d9ae8702fae502b3c24ec7804ce39f4d9d9a1ddad584f: Status 404 returned error can't find the container with id b6a78715b11f3aa7fe1d9ae8702fae502b3c24ec7804ce39f4d9d9a1ddad584f Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.729998 4914 generic.go:334] "Generic (PLEG): container finished" podID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerID="c88984954f157771adfc22785cda8de9fee7379cf6cf5bde8555bc32fa03831b" exitCode=0 Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.730200 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42klg" event={"ID":"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf","Type":"ContainerDied","Data":"c88984954f157771adfc22785cda8de9fee7379cf6cf5bde8555bc32fa03831b"} Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.730226 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42klg" 
event={"ID":"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf","Type":"ContainerStarted","Data":"380af97274196b65d60ae72bd7c08f545cb5caf72178eb55548b3eee65510884"} Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.732122 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.737385 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" event={"ID":"06adcbe1-93b1-4edd-b1a5-536f0c54043e","Type":"ContainerStarted","Data":"0bd3d0b3f922b2ac21710cbd9cbaac83c123109e5400d61e16e5e7ba52c4f58c"} Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.738753 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z85fs" event={"ID":"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b","Type":"ContainerStarted","Data":"b6a78715b11f3aa7fe1d9ae8702fae502b3c24ec7804ce39f4d9d9a1ddad584f"} Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.771583 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fhksq" podStartSLOduration=11.771558492 podStartE2EDuration="11.771558492s" podCreationTimestamp="2026-01-30 21:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:01.76569582 +0000 UTC m=+155.204332581" watchObservedRunningTime="2026-01-30 21:17:01.771558492 +0000 UTC m=+155.210195253" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.776330 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.857511 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 30 21:17:01 crc kubenswrapper[4914]: I0130 21:17:01.982495 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6tfzr"] Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.030176 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.109100 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mchvp"] Jan 30 21:17:02 crc kubenswrapper[4914]: W0130 21:17:02.116235 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc57ab23_e2b4_42f7_a4ac_d1cb1871e964.slice/crio-c90059274740e274654a112979d08ee60eca8dbc1972fe6ba282d5b10514b137 WatchSource:0}: Error finding container c90059274740e274654a112979d08ee60eca8dbc1972fe6ba282d5b10514b137: Status 404 returned error can't find the container with id c90059274740e274654a112979d08ee60eca8dbc1972fe6ba282d5b10514b137 Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.146226 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v6j7j"] Jan 30 21:17:02 crc kubenswrapper[4914]: W0130 21:17:02.161148 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928780fe_51a3_4e38_b573_31145d0a720c.slice/crio-5831f457691f8c908722163d4c98c901c7ab8d4290a21ae09273e8c9aadda7e0 WatchSource:0}: Error finding container 5831f457691f8c908722163d4c98c901c7ab8d4290a21ae09273e8c9aadda7e0: Status 
404 returned error can't find the container with id 5831f457691f8c908722163d4c98c901c7ab8d4290a21ae09273e8c9aadda7e0 Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.535896 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:17:02 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:17:02 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:17:02 crc kubenswrapper[4914]: healthz check failed Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.536162 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.745717 4914 generic.go:334] "Generic (PLEG): container finished" podID="e050cbd0-653b-4d23-8a69-affa52be9608" containerID="1c5e4d37c10ce7fb0002fa4c617f9dd7f53d4300c4be72db9aa287a9f0ecb40d" exitCode=0 Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.745774 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" event={"ID":"e050cbd0-653b-4d23-8a69-affa52be9608","Type":"ContainerDied","Data":"1c5e4d37c10ce7fb0002fa4c617f9dd7f53d4300c4be72db9aa287a9f0ecb40d"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.747649 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4efdf060-a1d1-490b-9aa5-29084e680131","Type":"ContainerStarted","Data":"952d8445fd659b998454ed17f89babf7e5f11234c36f33b20ec0e1ad3f28411d"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.747750 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4efdf060-a1d1-490b-9aa5-29084e680131","Type":"ContainerStarted","Data":"ec57467918d4368e38b41ca8ee59ef6cf21e84d1d83bf22a629cc5aa5050e4d4"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.750743 4914 generic.go:334] "Generic (PLEG): container finished" podID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerID="f5ee530474bc75f60291b433415f94978996daa68c1d8a593f40cb3cb82a8824" exitCode=0 Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.751034 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z85fs" event={"ID":"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b","Type":"ContainerDied","Data":"f5ee530474bc75f60291b433415f94978996daa68c1d8a593f40cb3cb82a8824"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.753227 4914 generic.go:334] "Generic (PLEG): container finished" podID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerID="77f00ee02b24543c1fb4cb7c11e07f9c86a780537f191e91a069056a6bb6828a" exitCode=0 Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.753275 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mchvp" event={"ID":"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964","Type":"ContainerDied","Data":"77f00ee02b24543c1fb4cb7c11e07f9c86a780537f191e91a069056a6bb6828a"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.753327 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mchvp" event={"ID":"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964","Type":"ContainerStarted","Data":"c90059274740e274654a112979d08ee60eca8dbc1972fe6ba282d5b10514b137"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.759792 4914 generic.go:334] "Generic (PLEG): container finished" podID="928780fe-51a3-4e38-b573-31145d0a720c" containerID="b3b6fe6dd77a12c107f2db15d939099ad27d73a7bf1ce66c8fee111f29723818" exitCode=0 Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 
21:17:02.759919 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6j7j" event={"ID":"928780fe-51a3-4e38-b573-31145d0a720c","Type":"ContainerDied","Data":"b3b6fe6dd77a12c107f2db15d939099ad27d73a7bf1ce66c8fee111f29723818"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.759973 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6j7j" event={"ID":"928780fe-51a3-4e38-b573-31145d0a720c","Type":"ContainerStarted","Data":"5831f457691f8c908722163d4c98c901c7ab8d4290a21ae09273e8c9aadda7e0"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.761660 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" event={"ID":"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c","Type":"ContainerStarted","Data":"47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.761697 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" event={"ID":"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c","Type":"ContainerStarted","Data":"01b066e3de4e7185a85679efccab79cc6cedd94d4a590d317e82fa5339a3f4e6"} Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.764370 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.818141 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.823381 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tfjll" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.824813 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.824798331 podStartE2EDuration="1.824798331s" podCreationTimestamp="2026-01-30 21:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:02.822137167 +0000 UTC m=+156.260773948" watchObservedRunningTime="2026-01-30 21:17:02.824798331 +0000 UTC m=+156.263435092" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.858152 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.858202 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.858217 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.858274 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.865986 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" podStartSLOduration=134.865967376 podStartE2EDuration="2m14.865967376s" podCreationTimestamp="2026-01-30 21:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:02.865852933 +0000 UTC m=+156.304489704" watchObservedRunningTime="2026-01-30 21:17:02.865967376 +0000 UTC m=+156.304604137" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.915347 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.915529 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.928285 4914 patch_prober.go:28] interesting pod/console-f9d7485db-scclv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.928366 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-scclv" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.987527 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8nmx"] Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.988479 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.993723 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 21:17:02 crc kubenswrapper[4914]: I0130 21:17:02.999330 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8nmx"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.068886 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.114419 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-utilities\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.114471 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-catalog-content\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.114495 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lngzb\" (UniqueName: \"kubernetes.io/projected/cfdb54ed-594a-4867-b500-68bdd392ce12-kube-api-access-lngzb\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.142590 4914 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-md7dg" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.216165 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-utilities\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.216281 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-catalog-content\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.216304 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lngzb\" (UniqueName: \"kubernetes.io/projected/cfdb54ed-594a-4867-b500-68bdd392ce12-kube-api-access-lngzb\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.217560 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-catalog-content\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.218687 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-utilities\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " 
pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.235532 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngzb\" (UniqueName: \"kubernetes.io/projected/cfdb54ed-594a-4867-b500-68bdd392ce12-kube-api-access-lngzb\") pod \"redhat-marketplace-f8nmx\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.307375 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.392377 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8vwgc"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.393454 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.414537 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8vwgc"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.521805 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-catalog-content\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.521873 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-utilities\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 
21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.521919 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7vx\" (UniqueName: \"kubernetes.io/projected/6e7c4a68-42c2-451f-b4ab-411361c45c63-kube-api-access-8j7vx\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.532094 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.533246 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8nmx"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.534909 4914 patch_prober.go:28] interesting pod/router-default-5444994796-f65q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:17:03 crc kubenswrapper[4914]: [-]has-synced failed: reason withheld Jan 30 21:17:03 crc kubenswrapper[4914]: [+]process-running ok Jan 30 21:17:03 crc kubenswrapper[4914]: healthz check failed Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.534944 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f65q2" podUID="a961d0f9-f1b6-4a3b-8c49-f03f1b797632" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:17:03 crc kubenswrapper[4914]: W0130 21:17:03.550000 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfdb54ed_594a_4867_b500_68bdd392ce12.slice/crio-f8e2be05e690a5929edb936fcae3378d41ea84001f475930197010ef89a2edc1 WatchSource:0}: Error finding container 
f8e2be05e690a5929edb936fcae3378d41ea84001f475930197010ef89a2edc1: Status 404 returned error can't find the container with id f8e2be05e690a5929edb936fcae3378d41ea84001f475930197010ef89a2edc1 Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.628427 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j7vx\" (UniqueName: \"kubernetes.io/projected/6e7c4a68-42c2-451f-b4ab-411361c45c63-kube-api-access-8j7vx\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.628483 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-catalog-content\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.628541 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-utilities\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.630016 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-catalog-content\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.630486 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-utilities\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.645129 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.650422 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j7vx\" (UniqueName: \"kubernetes.io/projected/6e7c4a68-42c2-451f-b4ab-411361c45c63-kube-api-access-8j7vx\") pod \"redhat-marketplace-8vwgc\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.728147 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.810752 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.811353 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.819221 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.819473 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.892069 4914 generic.go:334] "Generic (PLEG): container finished" podID="4efdf060-a1d1-490b-9aa5-29084e680131" containerID="952d8445fd659b998454ed17f89babf7e5f11234c36f33b20ec0e1ad3f28411d" exitCode=0 Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.905760 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.905799 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerStarted","Data":"a855e2c6e2cb25d8c669645986034d6797b267b35861d80788c2f6f68ffcbaa3"} Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.905818 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerStarted","Data":"f8e2be05e690a5929edb936fcae3378d41ea84001f475930197010ef89a2edc1"} Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.905828 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4efdf060-a1d1-490b-9aa5-29084e680131","Type":"ContainerDied","Data":"952d8445fd659b998454ed17f89babf7e5f11234c36f33b20ec0e1ad3f28411d"} Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.937986 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.938091 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.985146 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4bzx9"] Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.986947 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.989390 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 21:17:03 crc kubenswrapper[4914]: I0130 21:17:03.999290 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4bzx9"] Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.039083 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.039162 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bspw\" (UniqueName: 
\"kubernetes.io/projected/e8b53784-6398-419a-84b0-65f2550636a5-kube-api-access-8bspw\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.039183 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-catalog-content\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.039239 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.039418 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-utilities\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.041150 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.067190 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.140751 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-utilities\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.140849 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bspw\" (UniqueName: \"kubernetes.io/projected/e8b53784-6398-419a-84b0-65f2550636a5-kube-api-access-8bspw\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.140867 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-catalog-content\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.141235 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-utilities\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.141305 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-catalog-content\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.150996 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.159196 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bspw\" (UniqueName: \"kubernetes.io/projected/e8b53784-6398-419a-84b0-65f2550636a5-kube-api-access-8bspw\") pod \"redhat-operators-4bzx9\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") " pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.160510 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.245152 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e050cbd0-653b-4d23-8a69-affa52be9608-config-volume\") pod \"e050cbd0-653b-4d23-8a69-affa52be9608\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.245220 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e050cbd0-653b-4d23-8a69-affa52be9608-secret-volume\") pod \"e050cbd0-653b-4d23-8a69-affa52be9608\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.245264 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxkl\" (UniqueName: \"kubernetes.io/projected/e050cbd0-653b-4d23-8a69-affa52be9608-kube-api-access-vhxkl\") pod 
\"e050cbd0-653b-4d23-8a69-affa52be9608\" (UID: \"e050cbd0-653b-4d23-8a69-affa52be9608\") " Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.249543 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8vwgc"] Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.249583 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e050cbd0-653b-4d23-8a69-affa52be9608-config-volume" (OuterVolumeSpecName: "config-volume") pod "e050cbd0-653b-4d23-8a69-affa52be9608" (UID: "e050cbd0-653b-4d23-8a69-affa52be9608"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.256571 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e050cbd0-653b-4d23-8a69-affa52be9608-kube-api-access-vhxkl" (OuterVolumeSpecName: "kube-api-access-vhxkl") pod "e050cbd0-653b-4d23-8a69-affa52be9608" (UID: "e050cbd0-653b-4d23-8a69-affa52be9608"). InnerVolumeSpecName "kube-api-access-vhxkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.256861 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e050cbd0-653b-4d23-8a69-affa52be9608-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e050cbd0-653b-4d23-8a69-affa52be9608" (UID: "e050cbd0-653b-4d23-8a69-affa52be9608"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:17:04 crc kubenswrapper[4914]: W0130 21:17:04.284420 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7c4a68_42c2_451f_b4ab_411361c45c63.slice/crio-b0a4b2c558f83b2a251b9d900b4c168876383122f57d5625a785ad547d3b31fb WatchSource:0}: Error finding container b0a4b2c558f83b2a251b9d900b4c168876383122f57d5625a785ad547d3b31fb: Status 404 returned error can't find the container with id b0a4b2c558f83b2a251b9d900b4c168876383122f57d5625a785ad547d3b31fb Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.309774 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.346227 4914 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e050cbd0-653b-4d23-8a69-affa52be9608-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.346570 4914 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e050cbd0-653b-4d23-8a69-affa52be9608-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.346584 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxkl\" (UniqueName: \"kubernetes.io/projected/e050cbd0-653b-4d23-8a69-affa52be9608-kube-api-access-vhxkl\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.386716 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-76ck9"] Jan 30 21:17:04 crc kubenswrapper[4914]: E0130 21:17:04.386967 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e050cbd0-653b-4d23-8a69-affa52be9608" containerName="collect-profiles" Jan 30 21:17:04 crc 
kubenswrapper[4914]: I0130 21:17:04.386981 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e050cbd0-653b-4d23-8a69-affa52be9608" containerName="collect-profiles" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.387186 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="e050cbd0-653b-4d23-8a69-affa52be9608" containerName="collect-profiles" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.388133 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.396688 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-76ck9"] Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.447999 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8zpd\" (UniqueName: \"kubernetes.io/projected/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-kube-api-access-h8zpd\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.448050 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-utilities\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.448110 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-catalog-content\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc 
kubenswrapper[4914]: I0130 21:17:04.539008 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4bzx9"] Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.542501 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.566293 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8zpd\" (UniqueName: \"kubernetes.io/projected/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-kube-api-access-h8zpd\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.566341 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-utilities\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.566385 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-catalog-content\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.566808 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-catalog-content\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.566940 4914 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f65q2" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.566937 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-utilities\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.620298 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8zpd\" (UniqueName: \"kubernetes.io/projected/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-kube-api-access-h8zpd\") pod \"redhat-operators-76ck9\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.639099 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.741081 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.902259 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.902254 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd" event={"ID":"e050cbd0-653b-4d23-8a69-affa52be9608","Type":"ContainerDied","Data":"29f7e9fa40626e2a6348c3b05bcdd84292e01e81ed5640d31d52315ca3a400c0"} Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.902376 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f7e9fa40626e2a6348c3b05bcdd84292e01e81ed5640d31d52315ca3a400c0" Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.905792 4914 generic.go:334] "Generic (PLEG): container finished" podID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerID="a855e2c6e2cb25d8c669645986034d6797b267b35861d80788c2f6f68ffcbaa3" exitCode=0 Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.905867 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerDied","Data":"a855e2c6e2cb25d8c669645986034d6797b267b35861d80788c2f6f68ffcbaa3"} Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.908841 4914 generic.go:334] "Generic (PLEG): container finished" podID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerID="e0543019f162bed939325c9c9581a1ce22733965da6ebdbac22ca576e2450847" exitCode=0 Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.908911 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8vwgc" event={"ID":"6e7c4a68-42c2-451f-b4ab-411361c45c63","Type":"ContainerDied","Data":"e0543019f162bed939325c9c9581a1ce22733965da6ebdbac22ca576e2450847"} Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.908944 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8vwgc" 
event={"ID":"6e7c4a68-42c2-451f-b4ab-411361c45c63","Type":"ContainerStarted","Data":"b0a4b2c558f83b2a251b9d900b4c168876383122f57d5625a785ad547d3b31fb"} Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.915454 4914 generic.go:334] "Generic (PLEG): container finished" podID="e8b53784-6398-419a-84b0-65f2550636a5" containerID="a2fe354cf389270fef530a8d62899781b625c890a229f71b72f0b4bf6c4fe824" exitCode=0 Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.915526 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bzx9" event={"ID":"e8b53784-6398-419a-84b0-65f2550636a5","Type":"ContainerDied","Data":"a2fe354cf389270fef530a8d62899781b625c890a229f71b72f0b4bf6c4fe824"} Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.915547 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bzx9" event={"ID":"e8b53784-6398-419a-84b0-65f2550636a5","Type":"ContainerStarted","Data":"06386b98914b712503f312d0b138a05635fc83e6c2a0e800d631af1ce9c26417"} Jan 30 21:17:04 crc kubenswrapper[4914]: I0130 21:17:04.920190 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8","Type":"ContainerStarted","Data":"93adc02a3c91d96315c3a4c7d35bcc84f9bec2a9298f5c8bf1607b6aff1ecd57"} Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.286511 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.369633 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-76ck9"] Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.382667 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efdf060-a1d1-490b-9aa5-29084e680131-kube-api-access\") pod \"4efdf060-a1d1-490b-9aa5-29084e680131\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.382780 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efdf060-a1d1-490b-9aa5-29084e680131-kubelet-dir\") pod \"4efdf060-a1d1-490b-9aa5-29084e680131\" (UID: \"4efdf060-a1d1-490b-9aa5-29084e680131\") " Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.383158 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4efdf060-a1d1-490b-9aa5-29084e680131-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4efdf060-a1d1-490b-9aa5-29084e680131" (UID: "4efdf060-a1d1-490b-9aa5-29084e680131"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.389186 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efdf060-a1d1-490b-9aa5-29084e680131-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4efdf060-a1d1-490b-9aa5-29084e680131" (UID: "4efdf060-a1d1-490b-9aa5-29084e680131"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.484726 4914 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4efdf060-a1d1-490b-9aa5-29084e680131-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.484807 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4efdf060-a1d1-490b-9aa5-29084e680131-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.941957 4914 generic.go:334] "Generic (PLEG): container finished" podID="f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8" containerID="28d3f689e814011f197112d34c912697cdd023ef1feb50b20db7caf2566918b0" exitCode=0 Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.942572 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8","Type":"ContainerDied","Data":"28d3f689e814011f197112d34c912697cdd023ef1feb50b20db7caf2566918b0"} Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.958272 4914 generic.go:334] "Generic (PLEG): container finished" podID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerID="e660781064cef24c7cc43acef415f48a74968c9e7526f08eb68b65e1597bc136" exitCode=0 Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.958361 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ck9" event={"ID":"9d72dac1-62cd-4ab2-bf74-89c95d9762b0","Type":"ContainerDied","Data":"e660781064cef24c7cc43acef415f48a74968c9e7526f08eb68b65e1597bc136"} Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.958385 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ck9" 
event={"ID":"9d72dac1-62cd-4ab2-bf74-89c95d9762b0","Type":"ContainerStarted","Data":"36c81e40799bd1b1146d692a8980aa818d74455ebde9a815cb34920f0f3be431"} Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.964569 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4efdf060-a1d1-490b-9aa5-29084e680131","Type":"ContainerDied","Data":"ec57467918d4368e38b41ca8ee59ef6cf21e84d1d83bf22a629cc5aa5050e4d4"} Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.964606 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec57467918d4368e38b41ca8ee59ef6cf21e84d1d83bf22a629cc5aa5050e4d4" Jan 30 21:17:05 crc kubenswrapper[4914]: I0130 21:17:05.964920 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.352941 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.435549 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kube-api-access\") pod \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.435697 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kubelet-dir\") pod \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\" (UID: \"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8\") " Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.436242 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8" (UID: "f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.441599 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8" (UID: "f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.537594 4914 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:07 crc kubenswrapper[4914]: I0130 21:17:07.537628 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:08 crc kubenswrapper[4914]: I0130 21:17:08.042535 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8","Type":"ContainerDied","Data":"93adc02a3c91d96315c3a4c7d35bcc84f9bec2a9298f5c8bf1607b6aff1ecd57"} Jan 30 21:17:08 crc kubenswrapper[4914]: I0130 21:17:08.042788 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93adc02a3c91d96315c3a4c7d35bcc84f9bec2a9298f5c8bf1607b6aff1ecd57" Jan 30 21:17:08 crc kubenswrapper[4914]: I0130 21:17:08.042843 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:17:08 crc kubenswrapper[4914]: I0130 21:17:08.733212 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fj2g8" Jan 30 21:17:12 crc kubenswrapper[4914]: I0130 21:17:12.857410 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 21:17:12 crc kubenswrapper[4914]: I0130 21:17:12.857533 4914 patch_prober.go:28] interesting pod/downloads-7954f5f757-mx28l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Jan 30 21:17:12 crc kubenswrapper[4914]: I0130 21:17:12.858290 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:17:12 crc kubenswrapper[4914]: I0130 21:17:12.858353 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mx28l" podUID="8a73fa67-f017-4a93-a8f5-6d2f753dcb37" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Jan 30 21:17:12 crc kubenswrapper[4914]: I0130 21:17:12.903928 4914 patch_prober.go:28] interesting pod/console-f9d7485db-scclv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Jan 30 21:17:12 crc 
kubenswrapper[4914]: I0130 21:17:12.903982 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-scclv" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Jan 30 21:17:13 crc kubenswrapper[4914]: I0130 21:17:13.423688 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pscbd"] Jan 30 21:17:13 crc kubenswrapper[4914]: I0130 21:17:13.440983 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" podUID="dff3a310-c986-4724-8862-6d609edb8612" containerName="controller-manager" containerID="cri-o://917f63e4bb780d9637d1222a52f82e6541b5d41edd0b80b535d176b614e455b1" gracePeriod=30 Jan 30 21:17:13 crc kubenswrapper[4914]: I0130 21:17:13.446476 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"] Jan 30 21:17:13 crc kubenswrapper[4914]: I0130 21:17:13.446686 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" podUID="247e526c-e643-4ffb-a6f2-b4678132b8a7" containerName="route-controller-manager" containerID="cri-o://6c7673879cc11c7b85f34981f7ca4377f62dc33e66b668449724f01aceffef7f" gracePeriod=30 Jan 30 21:17:16 crc kubenswrapper[4914]: I0130 21:17:16.128956 4914 generic.go:334] "Generic (PLEG): container finished" podID="dff3a310-c986-4724-8862-6d609edb8612" containerID="917f63e4bb780d9637d1222a52f82e6541b5d41edd0b80b535d176b614e455b1" exitCode=0 Jan 30 21:17:16 crc kubenswrapper[4914]: I0130 21:17:16.129102 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" 
event={"ID":"dff3a310-c986-4724-8862-6d609edb8612","Type":"ContainerDied","Data":"917f63e4bb780d9637d1222a52f82e6541b5d41edd0b80b535d176b614e455b1"} Jan 30 21:17:16 crc kubenswrapper[4914]: I0130 21:17:16.139632 4914 generic.go:334] "Generic (PLEG): container finished" podID="247e526c-e643-4ffb-a6f2-b4678132b8a7" containerID="6c7673879cc11c7b85f34981f7ca4377f62dc33e66b668449724f01aceffef7f" exitCode=0 Jan 30 21:17:16 crc kubenswrapper[4914]: I0130 21:17:16.139689 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" event={"ID":"247e526c-e643-4ffb-a6f2-b4678132b8a7","Type":"ContainerDied","Data":"6c7673879cc11c7b85f34981f7ca4377f62dc33e66b668449724f01aceffef7f"} Jan 30 21:17:21 crc kubenswrapper[4914]: I0130 21:17:21.555813 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.177179 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" event={"ID":"dff3a310-c986-4724-8862-6d609edb8612","Type":"ContainerDied","Data":"6290973f00dcb777e9b3f56eae8a34f71120f7569285ff79653465dcb11c21f2"} Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.177236 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6290973f00dcb777e9b3f56eae8a34f71120f7569285ff79653465dcb11c21f2" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.178977 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" event={"ID":"247e526c-e643-4ffb-a6f2-b4678132b8a7","Type":"ContainerDied","Data":"2bd09aace6ad40f8b2797f0d33b32e98acb09b4c8205d69edda8ecb6f0ab792c"} Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.179023 4914 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2bd09aace6ad40f8b2797f0d33b32e98acb09b4c8205d69edda8ecb6f0ab792c" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.208624 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.223018 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.240865 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45"] Jan 30 21:17:22 crc kubenswrapper[4914]: E0130 21:17:22.241088 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efdf060-a1d1-490b-9aa5-29084e680131" containerName="pruner" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241103 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efdf060-a1d1-490b-9aa5-29084e680131" containerName="pruner" Jan 30 21:17:22 crc kubenswrapper[4914]: E0130 21:17:22.241117 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247e526c-e643-4ffb-a6f2-b4678132b8a7" containerName="route-controller-manager" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241125 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e526c-e643-4ffb-a6f2-b4678132b8a7" containerName="route-controller-manager" Jan 30 21:17:22 crc kubenswrapper[4914]: E0130 21:17:22.241136 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8" containerName="pruner" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241144 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8" containerName="pruner" Jan 30 21:17:22 crc kubenswrapper[4914]: E0130 21:17:22.241159 4914 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dff3a310-c986-4724-8862-6d609edb8612" containerName="controller-manager" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241167 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff3a310-c986-4724-8862-6d609edb8612" containerName="controller-manager" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241272 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="247e526c-e643-4ffb-a6f2-b4678132b8a7" containerName="route-controller-manager" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241286 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efdf060-a1d1-490b-9aa5-29084e680131" containerName="pruner" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241301 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff3a310-c986-4724-8862-6d609edb8612" containerName="controller-manager" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241316 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1214d82-c1fe-4903-8f0c-d94d4fe5f6d8" containerName="pruner" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.241699 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.257074 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45"] Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366139 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-config\") pod \"247e526c-e643-4ffb-a6f2-b4678132b8a7\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366207 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-client-ca\") pod \"247e526c-e643-4ffb-a6f2-b4678132b8a7\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366261 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dff3a310-c986-4724-8862-6d609edb8612-serving-cert\") pod \"dff3a310-c986-4724-8862-6d609edb8612\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366293 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-client-ca\") pod \"dff3a310-c986-4724-8862-6d609edb8612\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366337 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-proxy-ca-bundles\") pod \"dff3a310-c986-4724-8862-6d609edb8612\" (UID: 
\"dff3a310-c986-4724-8862-6d609edb8612\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366359 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-config\") pod \"dff3a310-c986-4724-8862-6d609edb8612\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366408 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5t5k\" (UniqueName: \"kubernetes.io/projected/dff3a310-c986-4724-8862-6d609edb8612-kube-api-access-l5t5k\") pod \"dff3a310-c986-4724-8862-6d609edb8612\" (UID: \"dff3a310-c986-4724-8862-6d609edb8612\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.366896 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7jvj\" (UniqueName: \"kubernetes.io/projected/247e526c-e643-4ffb-a6f2-b4678132b8a7-kube-api-access-r7jvj\") pod \"247e526c-e643-4ffb-a6f2-b4678132b8a7\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367217 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-client-ca" (OuterVolumeSpecName: "client-ca") pod "dff3a310-c986-4724-8862-6d609edb8612" (UID: "dff3a310-c986-4724-8862-6d609edb8612"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367239 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "247e526c-e643-4ffb-a6f2-b4678132b8a7" (UID: "247e526c-e643-4ffb-a6f2-b4678132b8a7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367301 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/247e526c-e643-4ffb-a6f2-b4678132b8a7-serving-cert\") pod \"247e526c-e643-4ffb-a6f2-b4678132b8a7\" (UID: \"247e526c-e643-4ffb-a6f2-b4678132b8a7\") " Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367293 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-config" (OuterVolumeSpecName: "config") pod "247e526c-e643-4ffb-a6f2-b4678132b8a7" (UID: "247e526c-e643-4ffb-a6f2-b4678132b8a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367479 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-config" (OuterVolumeSpecName: "config") pod "dff3a310-c986-4724-8862-6d609edb8612" (UID: "dff3a310-c986-4724-8862-6d609edb8612"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367505 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-config\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367586 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6qs\" (UniqueName: \"kubernetes.io/projected/ed4f5638-f3fb-4c85-bf14-15d149eca55f-kube-api-access-tc6qs\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367686 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dff3a310-c986-4724-8862-6d609edb8612" (UID: "dff3a310-c986-4724-8862-6d609edb8612"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367798 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4f5638-f3fb-4c85-bf14-15d149eca55f-serving-cert\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.367868 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-client-ca\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.368077 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.368103 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.368125 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.368144 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/247e526c-e643-4ffb-a6f2-b4678132b8a7-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 
21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.368162 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dff3a310-c986-4724-8862-6d609edb8612-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.372530 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff3a310-c986-4724-8862-6d609edb8612-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dff3a310-c986-4724-8862-6d609edb8612" (UID: "dff3a310-c986-4724-8862-6d609edb8612"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.373260 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247e526c-e643-4ffb-a6f2-b4678132b8a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "247e526c-e643-4ffb-a6f2-b4678132b8a7" (UID: "247e526c-e643-4ffb-a6f2-b4678132b8a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.374262 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff3a310-c986-4724-8862-6d609edb8612-kube-api-access-l5t5k" (OuterVolumeSpecName: "kube-api-access-l5t5k") pod "dff3a310-c986-4724-8862-6d609edb8612" (UID: "dff3a310-c986-4724-8862-6d609edb8612"). InnerVolumeSpecName "kube-api-access-l5t5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.380768 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247e526c-e643-4ffb-a6f2-b4678132b8a7-kube-api-access-r7jvj" (OuterVolumeSpecName: "kube-api-access-r7jvj") pod "247e526c-e643-4ffb-a6f2-b4678132b8a7" (UID: "247e526c-e643-4ffb-a6f2-b4678132b8a7"). InnerVolumeSpecName "kube-api-access-r7jvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.429956 4914 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-htz2z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.430396 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" podUID="247e526c-e643-4ffb-a6f2-b4678132b8a7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.469753 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-config\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.469811 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6qs\" (UniqueName: \"kubernetes.io/projected/ed4f5638-f3fb-4c85-bf14-15d149eca55f-kube-api-access-tc6qs\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.469858 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ed4f5638-f3fb-4c85-bf14-15d149eca55f-serving-cert\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.469933 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-client-ca\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.470027 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5t5k\" (UniqueName: \"kubernetes.io/projected/dff3a310-c986-4724-8862-6d609edb8612-kube-api-access-l5t5k\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.470043 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7jvj\" (UniqueName: \"kubernetes.io/projected/247e526c-e643-4ffb-a6f2-b4678132b8a7-kube-api-access-r7jvj\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.470054 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/247e526c-e643-4ffb-a6f2-b4678132b8a7-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.470066 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dff3a310-c986-4724-8862-6d609edb8612-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.472554 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-client-ca\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.474567 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-config\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.476945 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4f5638-f3fb-4c85-bf14-15d149eca55f-serving-cert\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.495196 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6qs\" (UniqueName: \"kubernetes.io/projected/ed4f5638-f3fb-4c85-bf14-15d149eca55f-kube-api-access-tc6qs\") pod \"route-controller-manager-9f9b495d8-p4r45\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.574045 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.700795 4914 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pscbd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.700876 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" podUID="dff3a310-c986-4724-8862-6d609edb8612" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.865037 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mx28l" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.906953 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:17:22 crc kubenswrapper[4914]: I0130 21:17:22.920018 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-scclv" Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.183102 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z" Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.183268 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pscbd" Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.217302 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"] Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.223055 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-htz2z"] Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.226949 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pscbd"] Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.229990 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pscbd"] Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.826801 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247e526c-e643-4ffb-a6f2-b4678132b8a7" path="/var/lib/kubelet/pods/247e526c-e643-4ffb-a6f2-b4678132b8a7/volumes" Jan 30 21:17:23 crc kubenswrapper[4914]: I0130 21:17:23.827576 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff3a310-c986-4724-8862-6d609edb8612" path="/var/lib/kubelet/pods/dff3a310-c986-4724-8862-6d609edb8612/volumes" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.652841 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4"] Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.654253 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.657306 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.658128 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.658415 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.659030 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.659097 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.659788 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.669121 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.669540 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4"] Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.717271 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-serving-cert\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " 
pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.717585 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7s9\" (UniqueName: \"kubernetes.io/projected/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-kube-api-access-px7s9\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.717764 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-client-ca\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.717888 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-proxy-ca-bundles\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.718053 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-config\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.819343 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px7s9\" 
(UniqueName: \"kubernetes.io/projected/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-kube-api-access-px7s9\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.819456 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-client-ca\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.819494 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-proxy-ca-bundles\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.819552 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-config\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.819650 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-serving-cert\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.878130 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-client-ca\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.878191 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-serving-cert\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.878981 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-proxy-ca-bundles\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.879225 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-config\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:24 crc kubenswrapper[4914]: I0130 21:17:24.880223 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7s9\" (UniqueName: \"kubernetes.io/projected/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-kube-api-access-px7s9\") pod \"controller-manager-7f9fb947b7-5j2m4\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 
21:17:25 crc kubenswrapper[4914]: I0130 21:17:25.042469 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:26 crc kubenswrapper[4914]: I0130 21:17:26.983645 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:17:26 crc kubenswrapper[4914]: I0130 21:17:26.984066 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:17:33 crc kubenswrapper[4914]: I0130 21:17:33.432979 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4"] Jan 30 21:17:33 crc kubenswrapper[4914]: I0130 21:17:33.520796 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45"] Jan 30 21:17:33 crc kubenswrapper[4914]: I0130 21:17:33.666131 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptx46" Jan 30 21:17:36 crc kubenswrapper[4914]: I0130 21:17:36.272557 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:36 crc kubenswrapper[4914]: E0130 21:17:36.826798 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 21:17:36 crc kubenswrapper[4914]: E0130 21:17:36.826940 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2ncf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mchvp_openshift-marketplace(cc57ab23-e2b4-42f7-a4ac-d1cb1871e964): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 
21:17:36 crc kubenswrapper[4914]: E0130 21:17:36.828144 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mchvp" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" Jan 30 21:17:40 crc kubenswrapper[4914]: I0130 21:17:40.998223 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:40.999628 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.006854 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.006984 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.021612 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.083972 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.084058 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.185402 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.185491 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.185565 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.228850 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: I0130 21:17:41.332987 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:41 crc kubenswrapper[4914]: E0130 21:17:41.337814 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 21:17:41 crc kubenswrapper[4914]: E0130 21:17:41.337988 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xkdr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-42klg_openshift-marketplace(d7bb25c2-cc0d-43a1-84ba-9b60c8298acf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:17:41 crc kubenswrapper[4914]: E0130 21:17:41.341296 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-42klg" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" Jan 30 21:17:44 crc kubenswrapper[4914]: E0130 21:17:44.571074 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 21:17:44 crc kubenswrapper[4914]: E0130 21:17:44.571612 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bspw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4bzx9_openshift-marketplace(e8b53784-6398-419a-84b0-65f2550636a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:17:44 crc kubenswrapper[4914]: E0130 21:17:44.573532 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4bzx9" podUID="e8b53784-6398-419a-84b0-65f2550636a5" Jan 30 21:17:45 crc 
kubenswrapper[4914]: I0130 21:17:45.411764 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.412874 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.424698 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.453275 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-var-lock\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.453356 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.453398 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5bf593-1c26-4d51-a30a-45477c960de6-kube-api-access\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.557952 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-var-lock\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.558053 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.558088 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5bf593-1c26-4d51-a30a-45477c960de6-kube-api-access\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.558652 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-var-lock\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.558733 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.581223 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5bf593-1c26-4d51-a30a-45477c960de6-kube-api-access\") pod \"installer-9-crc\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:45 crc kubenswrapper[4914]: I0130 21:17:45.774610 4914 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:17:46 crc kubenswrapper[4914]: E0130 21:17:46.833333 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-42klg" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" Jan 30 21:17:46 crc kubenswrapper[4914]: E0130 21:17:46.880244 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 21:17:46 crc kubenswrapper[4914]: E0130 21:17:46.880400 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8zpd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-76ck9_openshift-marketplace(9d72dac1-62cd-4ab2-bf74-89c95d9762b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:17:46 crc kubenswrapper[4914]: E0130 21:17:46.881650 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-76ck9" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" Jan 30 21:17:50 crc 
kubenswrapper[4914]: E0130 21:17:50.887064 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-76ck9" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" Jan 30 21:17:50 crc kubenswrapper[4914]: E0130 21:17:50.887592 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4bzx9" podUID="e8b53784-6398-419a-84b0-65f2550636a5" Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.006831 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.007236 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lngzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f8nmx_openshift-marketplace(cfdb54ed-594a-4867-b500-68bdd392ce12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.008810 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f8nmx" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" Jan 30 21:17:53 crc 
kubenswrapper[4914]: I0130 21:17:53.218658 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4"] Jan 30 21:17:53 crc kubenswrapper[4914]: W0130 21:17:53.232408 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0c8ea7_0152_48d1_aff0_bf27f0c43b24.slice/crio-10b9a5c88f7ef4913cb46655f644cb08bb74573203f318d114a7431787e52f0b WatchSource:0}: Error finding container 10b9a5c88f7ef4913cb46655f644cb08bb74573203f318d114a7431787e52f0b: Status 404 returned error can't find the container with id 10b9a5c88f7ef4913cb46655f644cb08bb74573203f318d114a7431787e52f0b Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.259674 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 21:17:53 crc kubenswrapper[4914]: W0130 21:17:53.277734 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3f5bf593_1c26_4d51_a30a_45477c960de6.slice/crio-7e28bc98ac5413785bfbf50b7e6e3aa7fba8d9c21e7b8cb13f0dce3df3f42757 WatchSource:0}: Error finding container 7e28bc98ac5413785bfbf50b7e6e3aa7fba8d9c21e7b8cb13f0dce3df3f42757: Status 404 returned error can't find the container with id 7e28bc98ac5413785bfbf50b7e6e3aa7fba8d9c21e7b8cb13f0dce3df3f42757 Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.368013 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45"] Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.381189 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 21:17:53 crc kubenswrapper[4914]: W0130 21:17:53.393239 4914 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-poda290a9f3_c270_4fc2_a033_ff9d012d53c7.slice/crio-0b1a85298c62deb0562f1c05cbd5046400934ac54f8ee0cab1e39bce45cd9f15 WatchSource:0}: Error finding container 0b1a85298c62deb0562f1c05cbd5046400934ac54f8ee0cab1e39bce45cd9f15: Status 404 returned error can't find the container with id 0b1a85298c62deb0562f1c05cbd5046400934ac54f8ee0cab1e39bce45cd9f15 Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.427265 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a290a9f3-c270-4fc2-a033-ff9d012d53c7","Type":"ContainerStarted","Data":"0b1a85298c62deb0562f1c05cbd5046400934ac54f8ee0cab1e39bce45cd9f15"} Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.429557 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3f5bf593-1c26-4d51-a30a-45477c960de6","Type":"ContainerStarted","Data":"7e28bc98ac5413785bfbf50b7e6e3aa7fba8d9c21e7b8cb13f0dce3df3f42757"} Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.430797 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" event={"ID":"ed4f5638-f3fb-4c85-bf14-15d149eca55f","Type":"ContainerStarted","Data":"b32159d1039becdb68db37e66cdf38db0bb7893141cd8f0183c3e8f9464b07c9"} Jan 30 21:17:53 crc kubenswrapper[4914]: I0130 21:17:53.432956 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" event={"ID":"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24","Type":"ContainerStarted","Data":"10b9a5c88f7ef4913cb46655f644cb08bb74573203f318d114a7431787e52f0b"} Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.443540 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f8nmx" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.890388 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.890558 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5w5t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolic
y:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-z85fs_openshift-marketplace(6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:17:53 crc kubenswrapper[4914]: E0130 21:17:53.891909 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-z85fs" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" Jan 30 21:17:54 crc kubenswrapper[4914]: E0130 21:17:54.136102 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 21:17:54 crc kubenswrapper[4914]: E0130 21:17:54.136623 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8j7vx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8vwgc_openshift-marketplace(6e7c4a68-42c2-451f-b4ab-411361c45c63): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:17:54 crc kubenswrapper[4914]: E0130 21:17:54.137782 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8vwgc" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" Jan 30 21:17:54 crc 
kubenswrapper[4914]: I0130 21:17:54.449571 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3f5bf593-1c26-4d51-a30a-45477c960de6","Type":"ContainerStarted","Data":"8167bc91614a41794472650e9eb72b1cb952d413aa4cd724888bda1536eae6b6"} Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.451097 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" event={"ID":"ed4f5638-f3fb-4c85-bf14-15d149eca55f","Type":"ContainerStarted","Data":"1712eff2fc7941075ba68929f0881781de974f6555ab9f88cde3dfa84882b7c2"} Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.451207 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" podUID="ed4f5638-f3fb-4c85-bf14-15d149eca55f" containerName="route-controller-manager" containerID="cri-o://1712eff2fc7941075ba68929f0881781de974f6555ab9f88cde3dfa84882b7c2" gracePeriod=30 Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.451511 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.452975 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" event={"ID":"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24","Type":"ContainerStarted","Data":"d969b9c64f0dbfe9e0dd6226ee8b20307d634194e7c825c5201618e34264c752"} Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.453026 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerName="controller-manager" containerID="cri-o://d969b9c64f0dbfe9e0dd6226ee8b20307d634194e7c825c5201618e34264c752" 
gracePeriod=30 Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.453299 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.456359 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.457488 4914 generic.go:334] "Generic (PLEG): container finished" podID="928780fe-51a3-4e38-b573-31145d0a720c" containerID="ef4e722e0c4b749f59efff6fa96fe38c78d4523e3a54aef6bdfd601b0279056d" exitCode=0 Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.457555 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6j7j" event={"ID":"928780fe-51a3-4e38-b573-31145d0a720c","Type":"ContainerDied","Data":"ef4e722e0c4b749f59efff6fa96fe38c78d4523e3a54aef6bdfd601b0279056d"} Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.461944 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a290a9f3-c270-4fc2-a033-ff9d012d53c7","Type":"ContainerStarted","Data":"466c879660a67f0ed8afa2642aace89aac23d158c04cd275c882667b68a54903"} Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.462230 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:17:54 crc kubenswrapper[4914]: E0130 21:17:54.463129 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8vwgc" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" Jan 30 21:17:54 crc kubenswrapper[4914]: E0130 21:17:54.465596 
4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-z85fs" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.470193 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.470169827 podStartE2EDuration="9.470169827s" podCreationTimestamp="2026-01-30 21:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:54.462237355 +0000 UTC m=+207.900874116" watchObservedRunningTime="2026-01-30 21:17:54.470169827 +0000 UTC m=+207.908806588" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.494727 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=14.494698003 podStartE2EDuration="14.494698003s" podCreationTimestamp="2026-01-30 21:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:54.476747925 +0000 UTC m=+207.915384686" watchObservedRunningTime="2026-01-30 21:17:54.494698003 +0000 UTC m=+207.933334764" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.519089 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" podStartSLOduration=41.519068715 podStartE2EDuration="41.519068715s" podCreationTimestamp="2026-01-30 21:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:54.515239924 +0000 UTC m=+207.953876705" 
watchObservedRunningTime="2026-01-30 21:17:54.519068715 +0000 UTC m=+207.957705486" Jan 30 21:17:54 crc kubenswrapper[4914]: I0130 21:17:54.531521 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" podStartSLOduration=41.531504039 podStartE2EDuration="41.531504039s" podCreationTimestamp="2026-01-30 21:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:54.530979102 +0000 UTC m=+207.969615863" watchObservedRunningTime="2026-01-30 21:17:54.531504039 +0000 UTC m=+207.970140820" Jan 30 21:17:55 crc kubenswrapper[4914]: I0130 21:17:55.043205 4914 patch_prober.go:28] interesting pod/controller-manager-7f9fb947b7-5j2m4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Jan 30 21:17:55 crc kubenswrapper[4914]: I0130 21:17:55.044082 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Jan 30 21:17:55 crc kubenswrapper[4914]: I0130 21:17:55.469563 4914 generic.go:334] "Generic (PLEG): container finished" podID="ed4f5638-f3fb-4c85-bf14-15d149eca55f" containerID="1712eff2fc7941075ba68929f0881781de974f6555ab9f88cde3dfa84882b7c2" exitCode=0 Jan 30 21:17:55 crc kubenswrapper[4914]: I0130 21:17:55.469730 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" 
event={"ID":"ed4f5638-f3fb-4c85-bf14-15d149eca55f","Type":"ContainerDied","Data":"1712eff2fc7941075ba68929f0881781de974f6555ab9f88cde3dfa84882b7c2"} Jan 30 21:17:55 crc kubenswrapper[4914]: I0130 21:17:55.471234 4914 generic.go:334] "Generic (PLEG): container finished" podID="a290a9f3-c270-4fc2-a033-ff9d012d53c7" containerID="466c879660a67f0ed8afa2642aace89aac23d158c04cd275c882667b68a54903" exitCode=0 Jan 30 21:17:55 crc kubenswrapper[4914]: I0130 21:17:55.471288 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a290a9f3-c270-4fc2-a033-ff9d012d53c7","Type":"ContainerDied","Data":"466c879660a67f0ed8afa2642aace89aac23d158c04cd275c882667b68a54903"} Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.481004 4914 generic.go:334] "Generic (PLEG): container finished" podID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerID="d969b9c64f0dbfe9e0dd6226ee8b20307d634194e7c825c5201618e34264c752" exitCode=0 Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.481116 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" event={"ID":"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24","Type":"ContainerDied","Data":"d969b9c64f0dbfe9e0dd6226ee8b20307d634194e7c825c5201618e34264c752"} Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.759658 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.821835 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kubelet-dir\") pod \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.822003 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kube-api-access\") pod \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\" (UID: \"a290a9f3-c270-4fc2-a033-ff9d012d53c7\") " Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.822921 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a290a9f3-c270-4fc2-a033-ff9d012d53c7" (UID: "a290a9f3-c270-4fc2-a033-ff9d012d53c7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.840567 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a290a9f3-c270-4fc2-a033-ff9d012d53c7" (UID: "a290a9f3-c270-4fc2-a033-ff9d012d53c7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.922855 4914 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.922888 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a290a9f3-c270-4fc2-a033-ff9d012d53c7-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.983680 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.983785 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.983844 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:17:56 crc kubenswrapper[4914]: I0130 21:17:56.984550 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:17:56 crc 
kubenswrapper[4914]: I0130 21:17:56.984742 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4" gracePeriod=600 Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.036164 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.125242 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-client-ca\") pod \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.125290 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-config\") pod \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.125363 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc6qs\" (UniqueName: \"kubernetes.io/projected/ed4f5638-f3fb-4c85-bf14-15d149eca55f-kube-api-access-tc6qs\") pod \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.125448 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4f5638-f3fb-4c85-bf14-15d149eca55f-serving-cert\") pod \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\" (UID: \"ed4f5638-f3fb-4c85-bf14-15d149eca55f\") " 
Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.126162 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed4f5638-f3fb-4c85-bf14-15d149eca55f" (UID: "ed4f5638-f3fb-4c85-bf14-15d149eca55f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.126360 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-config" (OuterVolumeSpecName: "config") pod "ed4f5638-f3fb-4c85-bf14-15d149eca55f" (UID: "ed4f5638-f3fb-4c85-bf14-15d149eca55f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.228304 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.228362 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4f5638-f3fb-4c85-bf14-15d149eca55f-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.489309 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" event={"ID":"ed4f5638-f3fb-4c85-bf14-15d149eca55f","Type":"ContainerDied","Data":"b32159d1039becdb68db37e66cdf38db0bb7893141cd8f0183c3e8f9464b07c9"} Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.489355 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.489383 4914 scope.go:117] "RemoveContainer" containerID="1712eff2fc7941075ba68929f0881781de974f6555ab9f88cde3dfa84882b7c2" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.492858 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a290a9f3-c270-4fc2-a033-ff9d012d53c7","Type":"ContainerDied","Data":"0b1a85298c62deb0562f1c05cbd5046400934ac54f8ee0cab1e39bce45cd9f15"} Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.492899 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b1a85298c62deb0562f1c05cbd5046400934ac54f8ee0cab1e39bce45cd9f15" Jan 30 21:17:57 crc kubenswrapper[4914]: I0130 21:17:57.492992 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:17:58 crc kubenswrapper[4914]: I0130 21:17:58.502595 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4" exitCode=0 Jan 30 21:17:58 crc kubenswrapper[4914]: I0130 21:17:58.502766 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4"} Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.677133 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"] Jan 30 21:17:59 crc kubenswrapper[4914]: E0130 21:17:59.677652 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a290a9f3-c270-4fc2-a033-ff9d012d53c7" 
containerName="pruner" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.677685 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="a290a9f3-c270-4fc2-a033-ff9d012d53c7" containerName="pruner" Jan 30 21:17:59 crc kubenswrapper[4914]: E0130 21:17:59.677751 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4f5638-f3fb-4c85-bf14-15d149eca55f" containerName="route-controller-manager" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.677771 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4f5638-f3fb-4c85-bf14-15d149eca55f" containerName="route-controller-manager" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.678030 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="a290a9f3-c270-4fc2-a033-ff9d012d53c7" containerName="pruner" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.678070 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4f5638-f3fb-4c85-bf14-15d149eca55f" containerName="route-controller-manager" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.678963 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.697163 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"] Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.767283 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-config\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.767526 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-client-ca\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.767615 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js949\" (UniqueName: \"kubernetes.io/projected/6fc6e283-e750-42c7-b637-9d6c0c678ff7-kube-api-access-js949\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.767749 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc6e283-e750-42c7-b637-9d6c0c678ff7-serving-cert\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: 
\"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.872481 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-config\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.872690 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-client-ca\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.872809 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js949\" (UniqueName: \"kubernetes.io/projected/6fc6e283-e750-42c7-b637-9d6c0c678ff7-kube-api-access-js949\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.872884 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc6e283-e750-42c7-b637-9d6c0c678ff7-serving-cert\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.878472 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-client-ca\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:17:59 crc kubenswrapper[4914]: I0130 21:17:59.899107 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js949\" (UniqueName: \"kubernetes.io/projected/6fc6e283-e750-42c7-b637-9d6c0c678ff7-kube-api-access-js949\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:18:00 crc kubenswrapper[4914]: I0130 21:18:00.367662 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-config\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:18:00 crc kubenswrapper[4914]: I0130 21:18:00.369508 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc6e283-e750-42c7-b637-9d6c0c678ff7-serving-cert\") pod \"route-controller-manager-55659f8bb-4hhds\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") " pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:18:00 crc kubenswrapper[4914]: I0130 21:18:00.613433 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" Jan 30 21:18:04 crc kubenswrapper[4914]: I0130 21:18:04.476417 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4f5638-f3fb-4c85-bf14-15d149eca55f-kube-api-access-tc6qs" (OuterVolumeSpecName: "kube-api-access-tc6qs") pod "ed4f5638-f3fb-4c85-bf14-15d149eca55f" (UID: "ed4f5638-f3fb-4c85-bf14-15d149eca55f"). InnerVolumeSpecName "kube-api-access-tc6qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:04 crc kubenswrapper[4914]: I0130 21:18:04.476941 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f5638-f3fb-4c85-bf14-15d149eca55f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed4f5638-f3fb-4c85-bf14-15d149eca55f" (UID: "ed4f5638-f3fb-4c85-bf14-15d149eca55f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:18:04 crc kubenswrapper[4914]: I0130 21:18:04.543079 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed4f5638-f3fb-4c85-bf14-15d149eca55f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:04 crc kubenswrapper[4914]: I0130 21:18:04.543158 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc6qs\" (UniqueName: \"kubernetes.io/projected/ed4f5638-f3fb-4c85-bf14-15d149eca55f-kube-api-access-tc6qs\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:04 crc kubenswrapper[4914]: I0130 21:18:04.738536 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45"] Jan 30 21:18:04 crc kubenswrapper[4914]: I0130 21:18:04.744555 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f9b495d8-p4r45"] Jan 30 21:18:05 crc kubenswrapper[4914]: I0130 
21:18:05.829856 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4f5638-f3fb-4c85-bf14-15d149eca55f" path="/var/lib/kubelet/pods/ed4f5638-f3fb-4c85-bf14-15d149eca55f/volumes" Jan 30 21:18:06 crc kubenswrapper[4914]: I0130 21:18:06.044620 4914 patch_prober.go:28] interesting pod/controller-manager-7f9fb947b7-5j2m4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:18:06 crc kubenswrapper[4914]: I0130 21:18:06.044740 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.484349 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.535842 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-779f5dd757-wvdx6"] Jan 30 21:18:08 crc kubenswrapper[4914]: E0130 21:18:08.536193 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerName="controller-manager" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.536221 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerName="controller-manager" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.536424 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" containerName="controller-manager" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.537073 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.543095 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-779f5dd757-wvdx6"] Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.583656 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" event={"ID":"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24","Type":"ContainerDied","Data":"10b9a5c88f7ef4913cb46655f644cb08bb74573203f318d114a7431787e52f0b"} Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.583742 4914 scope.go:117] "RemoveContainer" containerID="d969b9c64f0dbfe9e0dd6226ee8b20307d634194e7c825c5201618e34264c752" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.583835 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.613994 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-proxy-ca-bundles\") pod \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614219 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-config\") pod \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614261 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-client-ca\") pod \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614351 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-serving-cert\") pod \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614384 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7s9\" (UniqueName: \"kubernetes.io/projected/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-kube-api-access-px7s9\") pod \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\" (UID: \"cc0c8ea7-0152-48d1-aff0-bf27f0c43b24\") " Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614624 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-serving-cert\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614686 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-proxy-ca-bundles\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614738 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7lbx\" (UniqueName: \"kubernetes.io/projected/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-kube-api-access-n7lbx\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614893 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-config\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.614959 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-client-ca" (OuterVolumeSpecName: "client-ca") pod "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" (UID: "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.615065 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-client-ca\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.615126 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.615909 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-config" (OuterVolumeSpecName: "config") pod "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" (UID: "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.616105 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" (UID: "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.619946 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-kube-api-access-px7s9" (OuterVolumeSpecName: "kube-api-access-px7s9") pod "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" (UID: "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24"). 
InnerVolumeSpecName "kube-api-access-px7s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.622970 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" (UID: "cc0c8ea7-0152-48d1-aff0-bf27f0c43b24"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.716606 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-config\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.716920 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-client-ca\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.716974 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-serving-cert\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717007 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-proxy-ca-bundles\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717046 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7lbx\" (UniqueName: \"kubernetes.io/projected/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-kube-api-access-n7lbx\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717161 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717183 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717195 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717207 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px7s9\" (UniqueName: \"kubernetes.io/projected/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24-kube-api-access-px7s9\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.717972 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-client-ca\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.718227 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-config\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.719000 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-proxy-ca-bundles\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.722642 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-serving-cert\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.738968 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7lbx\" (UniqueName: \"kubernetes.io/projected/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-kube-api-access-n7lbx\") pod \"controller-manager-779f5dd757-wvdx6\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") " pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.801635 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"]
Jan 30 21:18:08 crc kubenswrapper[4914]: W0130 21:18:08.814540 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc6e283_e750_42c7_b637_9d6c0c678ff7.slice/crio-6c8463ab3355dac793bbd4c92783c560df09481345c67134bc38860ad78eb9ff WatchSource:0}: Error finding container 6c8463ab3355dac793bbd4c92783c560df09481345c67134bc38860ad78eb9ff: Status 404 returned error can't find the container with id 6c8463ab3355dac793bbd4c92783c560df09481345c67134bc38860ad78eb9ff
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.870514 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.918381 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4"]
Jan 30 21:18:08 crc kubenswrapper[4914]: I0130 21:18:08.922765 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f9fb947b7-5j2m4"]
Jan 30 21:18:09 crc kubenswrapper[4914]: I0130 21:18:09.596847 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"2df647095348fd109e6817a5b9226907389cb72479ca19ac34e62f6c888f7739"}
Jan 30 21:18:09 crc kubenswrapper[4914]: I0130 21:18:09.601173 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" event={"ID":"6fc6e283-e750-42c7-b637-9d6c0c678ff7","Type":"ContainerStarted","Data":"6c8463ab3355dac793bbd4c92783c560df09481345c67134bc38860ad78eb9ff"}
Jan 30 21:18:09 crc kubenswrapper[4914]: I0130 21:18:09.603868 4914 generic.go:334] "Generic (PLEG): container finished" podID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerID="77b46b4b9dbad02d965296f93a07825864882a84a9373cb868c02c71619777f4" exitCode=0
Jan 30 21:18:09 crc kubenswrapper[4914]: I0130 21:18:09.603922 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mchvp" event={"ID":"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964","Type":"ContainerDied","Data":"77b46b4b9dbad02d965296f93a07825864882a84a9373cb868c02c71619777f4"}
Jan 30 21:18:09 crc kubenswrapper[4914]: I0130 21:18:09.701856 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-779f5dd757-wvdx6"]
Jan 30 21:18:09 crc kubenswrapper[4914]: I0130 21:18:09.832840 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0c8ea7-0152-48d1-aff0-bf27f0c43b24" path="/var/lib/kubelet/pods/cc0c8ea7-0152-48d1-aff0-bf27f0c43b24/volumes"
Jan 30 21:18:09 crc kubenswrapper[4914]: W0130 21:18:09.906235 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4b35b3b_59cf_4678_aa68_9b8b3f106ccb.slice/crio-bd7212e239954001e76b764eb6feb60fb949a6b29c1e4ce763f28d97e3067402 WatchSource:0}: Error finding container bd7212e239954001e76b764eb6feb60fb949a6b29c1e4ce763f28d97e3067402: Status 404 returned error can't find the container with id bd7212e239954001e76b764eb6feb60fb949a6b29c1e4ce763f28d97e3067402
Jan 30 21:18:10 crc kubenswrapper[4914]: I0130 21:18:10.635853 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" event={"ID":"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb","Type":"ContainerStarted","Data":"bd7212e239954001e76b764eb6feb60fb949a6b29c1e4ce763f28d97e3067402"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.641898 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6j7j" event={"ID":"928780fe-51a3-4e38-b573-31145d0a720c","Type":"ContainerStarted","Data":"029c3bf08148e737f453184ccdf8133d173a577950ac32f919ac94a83349e057"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.643534 4914 generic.go:334] "Generic (PLEG): container finished" podID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerID="cbed8f60ec766e8b254e01c63e988a7a57636a0ee2316fb74de55aaa8e56c01c" exitCode=0
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.643554 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerDied","Data":"cbed8f60ec766e8b254e01c63e988a7a57636a0ee2316fb74de55aaa8e56c01c"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.645974 4914 generic.go:334] "Generic (PLEG): container finished" podID="e8b53784-6398-419a-84b0-65f2550636a5" containerID="48be772e06de6dd15eb25e0b49fd16a1dd34eeaf3eb52ac4770a240031136397" exitCode=0
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.646032 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bzx9" event={"ID":"e8b53784-6398-419a-84b0-65f2550636a5","Type":"ContainerDied","Data":"48be772e06de6dd15eb25e0b49fd16a1dd34eeaf3eb52ac4770a240031136397"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.649025 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" event={"ID":"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb","Type":"ContainerStarted","Data":"9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.649751 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.651657 4914 generic.go:334] "Generic (PLEG): container finished" podID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerID="8095bc63239cb55d603b7b78cc833db4a62e3902b4bb8e7c7a9fe7b1e097fbd2" exitCode=0
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.651757 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42klg" event={"ID":"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf","Type":"ContainerDied","Data":"8095bc63239cb55d603b7b78cc833db4a62e3902b4bb8e7c7a9fe7b1e097fbd2"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.655098 4914 generic.go:334] "Generic (PLEG): container finished" podID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerID="166e18538ff97c4ffa88890caded12503c36477cf5e97e23b97844e85d8a353d" exitCode=0
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.655176 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z85fs" event={"ID":"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b","Type":"ContainerDied","Data":"166e18538ff97c4ffa88890caded12503c36477cf5e97e23b97844e85d8a353d"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.657363 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v6j7j" podStartSLOduration=4.101102792 podStartE2EDuration="1m10.657349485s" podCreationTimestamp="2026-01-30 21:17:01 +0000 UTC" firstStartedPulling="2026-01-30 21:17:02.769614007 +0000 UTC m=+156.208250808" lastFinishedPulling="2026-01-30 21:18:09.32586074 +0000 UTC m=+222.764497501" observedRunningTime="2026-01-30 21:18:11.657008004 +0000 UTC m=+225.095644765" watchObservedRunningTime="2026-01-30 21:18:11.657349485 +0000 UTC m=+225.095986246"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.658432 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.668730 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mchvp" event={"ID":"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964","Type":"ContainerStarted","Data":"6e665be8f902239f8445c8d30ec4541d7827fe30c1df1363626bf8830ae4d8b2"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.670535 4914 generic.go:334] "Generic (PLEG): container finished" podID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerID="cc003e570593d3d217b888a5b50d9e9459335432eaae5f30632565aec9370fa4" exitCode=0
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.670587 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ck9" event={"ID":"9d72dac1-62cd-4ab2-bf74-89c95d9762b0","Type":"ContainerDied","Data":"cc003e570593d3d217b888a5b50d9e9459335432eaae5f30632565aec9370fa4"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.672908 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" event={"ID":"6fc6e283-e750-42c7-b637-9d6c0c678ff7","Type":"ContainerStarted","Data":"6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.673623 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.676893 4914 generic.go:334] "Generic (PLEG): container finished" podID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerID="86da6e9aac3a39582835beae1baf5e3dc130b75737c33003caf4bf7e2e18558e" exitCode=0
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.676939 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8vwgc" event={"ID":"6e7c4a68-42c2-451f-b4ab-411361c45c63","Type":"ContainerDied","Data":"86da6e9aac3a39582835beae1baf5e3dc130b75737c33003caf4bf7e2e18558e"}
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.683051 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.691587 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" podStartSLOduration=38.691572408 podStartE2EDuration="38.691572408s" podCreationTimestamp="2026-01-30 21:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:11.688398158 +0000 UTC m=+225.127034919" watchObservedRunningTime="2026-01-30 21:18:11.691572408 +0000 UTC m=+225.130209169"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.754190 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mchvp" podStartSLOduration=2.746574739 podStartE2EDuration="1m10.75417157s" podCreationTimestamp="2026-01-30 21:17:01 +0000 UTC" firstStartedPulling="2026-01-30 21:17:02.75525562 +0000 UTC m=+156.193892391" lastFinishedPulling="2026-01-30 21:18:10.762852461 +0000 UTC m=+224.201489222" observedRunningTime="2026-01-30 21:18:11.75258288 +0000 UTC m=+225.191219641" watchObservedRunningTime="2026-01-30 21:18:11.75417157 +0000 UTC m=+225.192808331"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.777684 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v6j7j"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.777921 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v6j7j"
Jan 30 21:18:11 crc kubenswrapper[4914]: I0130 21:18:11.879213 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" podStartSLOduration=38.879197329 podStartE2EDuration="38.879197329s" podCreationTimestamp="2026-01-30 21:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:11.877236667 +0000 UTC m=+225.315873428" watchObservedRunningTime="2026-01-30 21:18:11.879197329 +0000 UTC m=+225.317834080"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.683276 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerStarted","Data":"c6267c12116c170fb3f3e060cfc1e5b3851a14d211f134b34989048f6e57a9d2"}
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.685505 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8vwgc" event={"ID":"6e7c4a68-42c2-451f-b4ab-411361c45c63","Type":"ContainerStarted","Data":"5330b805f7cc923926f44c561d3b37e5ef6bb8ce6c7777bcea85627a1ce4376b"}
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.687828 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bzx9" event={"ID":"e8b53784-6398-419a-84b0-65f2550636a5","Type":"ContainerStarted","Data":"3a4076445fd86f783fb252a2b1b6e4df818eb9ad47dec86d6534ca6490d229f9"}
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.689501 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ck9" event={"ID":"9d72dac1-62cd-4ab2-bf74-89c95d9762b0","Type":"ContainerStarted","Data":"3360641ec0456f1db76cd3f0424ed164fb998c7e3f29c9cfa748bb4a2502f415"}
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.691657 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42klg" event={"ID":"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf","Type":"ContainerStarted","Data":"5cc507b6043f61639986c694184303cff546e5f774a95bc245af04a66d96715c"}
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.694165 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z85fs" event={"ID":"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b","Type":"ContainerStarted","Data":"b49d3eb1e53fd4e0b37f8a187a1c4c3771aba2a95b2c9cb26548870333c8e72c"}
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.701904 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8nmx" podStartSLOduration=2.185187701 podStartE2EDuration="1m10.701883749s" podCreationTimestamp="2026-01-30 21:17:02 +0000 UTC" firstStartedPulling="2026-01-30 21:17:03.858715934 +0000 UTC m=+157.297352695" lastFinishedPulling="2026-01-30 21:18:12.375411982 +0000 UTC m=+225.814048743" observedRunningTime="2026-01-30 21:18:12.69810745 +0000 UTC m=+226.136744211" watchObservedRunningTime="2026-01-30 21:18:12.701883749 +0000 UTC m=+226.140520510"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.720107 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-42klg" podStartSLOduration=2.254824622 podStartE2EDuration="1m12.720090186s" podCreationTimestamp="2026-01-30 21:17:00 +0000 UTC" firstStartedPulling="2026-01-30 21:17:01.731813101 +0000 UTC m=+155.170449852" lastFinishedPulling="2026-01-30 21:18:12.197078655 +0000 UTC m=+225.635715416" observedRunningTime="2026-01-30 21:18:12.718786315 +0000 UTC m=+226.157423076" watchObservedRunningTime="2026-01-30 21:18:12.720090186 +0000 UTC m=+226.158726947"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.740475 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z85fs" podStartSLOduration=3.1956539680000002 podStartE2EDuration="1m12.740456651s" podCreationTimestamp="2026-01-30 21:17:00 +0000 UTC" firstStartedPulling="2026-01-30 21:17:02.752247637 +0000 UTC m=+156.190884438" lastFinishedPulling="2026-01-30 21:18:12.29705036 +0000 UTC m=+225.735687121" observedRunningTime="2026-01-30 21:18:12.737713974 +0000 UTC m=+226.176350735" watchObservedRunningTime="2026-01-30 21:18:12.740456651 +0000 UTC m=+226.179093412"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.760281 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8vwgc" podStartSLOduration=2.517788139 podStartE2EDuration="1m9.760264868s" podCreationTimestamp="2026-01-30 21:17:03 +0000 UTC" firstStartedPulling="2026-01-30 21:17:04.927865579 +0000 UTC m=+158.366502340" lastFinishedPulling="2026-01-30 21:18:12.170342308 +0000 UTC m=+225.608979069" observedRunningTime="2026-01-30 21:18:12.756909472 +0000 UTC m=+226.195546233" watchObservedRunningTime="2026-01-30 21:18:12.760264868 +0000 UTC m=+226.198901629"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.785205 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4bzx9" podStartSLOduration=2.618162369 podStartE2EDuration="1m9.785183527s" podCreationTimestamp="2026-01-30 21:17:03 +0000 UTC" firstStartedPulling="2026-01-30 21:17:04.916849182 +0000 UTC m=+158.355485943" lastFinishedPulling="2026-01-30 21:18:12.08387034 +0000 UTC m=+225.522507101" observedRunningTime="2026-01-30 21:18:12.781844401 +0000 UTC m=+226.220481162" watchObservedRunningTime="2026-01-30 21:18:12.785183527 +0000 UTC m=+226.223820288"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.808006 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-76ck9" podStartSLOduration=2.670901605 podStartE2EDuration="1m8.807989589s" podCreationTimestamp="2026-01-30 21:17:04 +0000 UTC" firstStartedPulling="2026-01-30 21:17:05.960316415 +0000 UTC m=+159.398953176" lastFinishedPulling="2026-01-30 21:18:12.097404399 +0000 UTC m=+225.536041160" observedRunningTime="2026-01-30 21:18:12.805877252 +0000 UTC m=+226.244514013" watchObservedRunningTime="2026-01-30 21:18:12.807989589 +0000 UTC m=+226.246626360"
Jan 30 21:18:12 crc kubenswrapper[4914]: I0130 21:18:12.958823 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-v6j7j" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:18:12 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:18:12 crc kubenswrapper[4914]: >
Jan 30 21:18:13 crc kubenswrapper[4914]: I0130 21:18:13.308043 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8nmx"
Jan 30 21:18:13 crc kubenswrapper[4914]: I0130 21:18:13.308093 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8nmx"
Jan 30 21:18:13 crc kubenswrapper[4914]: I0130 21:18:13.728787 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8vwgc"
Jan 30 21:18:13 crc kubenswrapper[4914]: I0130 21:18:13.729316 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8vwgc"
Jan 30 21:18:14 crc kubenswrapper[4914]: I0130 21:18:14.310466 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4bzx9"
Jan 30 21:18:14 crc kubenswrapper[4914]: I0130 21:18:14.310838 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4bzx9"
Jan 30 21:18:14 crc kubenswrapper[4914]: I0130 21:18:14.355996 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-f8nmx" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:18:14 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:18:14 crc kubenswrapper[4914]: >
Jan 30 21:18:14 crc kubenswrapper[4914]: I0130 21:18:14.741943 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-76ck9"
Jan 30 21:18:14 crc kubenswrapper[4914]: I0130 21:18:14.743084 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-76ck9"
Jan 30 21:18:14 crc kubenswrapper[4914]: I0130 21:18:14.785786 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-8vwgc" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:18:14 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:18:14 crc kubenswrapper[4914]: >
Jan 30 21:18:15 crc kubenswrapper[4914]: I0130 21:18:15.359581 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4bzx9" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:18:15 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:18:15 crc kubenswrapper[4914]: >
Jan 30 21:18:15 crc kubenswrapper[4914]: I0130 21:18:15.786346 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-76ck9" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:18:15 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:18:15 crc kubenswrapper[4914]: >
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.121835 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-42klg"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.122190 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-42klg"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.218333 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-42klg"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.381232 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z85fs"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.381276 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z85fs"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.432341 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z85fs"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.547859 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mchvp"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.548611 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mchvp"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.612097 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mchvp"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.790985 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mchvp"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.839338 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z85fs"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.839677 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v6j7j"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.839881 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-42klg"
Jan 30 21:18:21 crc kubenswrapper[4914]: I0130 21:18:21.893191 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v6j7j"
Jan 30 21:18:22 crc kubenswrapper[4914]: I0130 21:18:22.849215 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mchvp"]
Jan 30 21:18:23 crc kubenswrapper[4914]: I0130 21:18:23.373083 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8nmx"
Jan 30 21:18:23 crc kubenswrapper[4914]: I0130 21:18:23.418819 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8nmx"
Jan 30 21:18:23 crc kubenswrapper[4914]: I0130 21:18:23.766594 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mchvp" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="registry-server" containerID="cri-o://6e665be8f902239f8445c8d30ec4541d7827fe30c1df1363626bf8830ae4d8b2" gracePeriod=2
Jan 30 21:18:23 crc kubenswrapper[4914]: I0130 21:18:23.773313 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8vwgc"
Jan 30 21:18:23 crc kubenswrapper[4914]: I0130 21:18:23.815444 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8vwgc"
Jan 30 21:18:24 crc kubenswrapper[4914]: I0130 21:18:24.247498 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6j7j"]
Jan 30 21:18:24 crc kubenswrapper[4914]: I0130 21:18:24.247758 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v6j7j" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="registry-server" containerID="cri-o://029c3bf08148e737f453184ccdf8133d173a577950ac32f919ac94a83349e057" gracePeriod=2
Jan 30 21:18:24 crc kubenswrapper[4914]: I0130 21:18:24.351562 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4bzx9"
Jan 30 21:18:24 crc kubenswrapper[4914]: I0130 21:18:24.386690 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4bzx9"
Jan 30 21:18:24 crc kubenswrapper[4914]: I0130 21:18:24.805163 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-76ck9"
Jan 30 21:18:24 crc kubenswrapper[4914]: I0130 21:18:24.852660 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-76ck9"
Jan 30 21:18:26 crc kubenswrapper[4914]: I0130 21:18:26.644318 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8vwgc"]
Jan 30 21:18:26 crc kubenswrapper[4914]: I0130 21:18:26.644580 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8vwgc" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="registry-server" containerID="cri-o://5330b805f7cc923926f44c561d3b37e5ef6bb8ce6c7777bcea85627a1ce4376b" gracePeriod=2
Jan 30 21:18:26 crc kubenswrapper[4914]: I0130 21:18:26.781803 4914 generic.go:334] "Generic (PLEG): container finished" podID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerID="6e665be8f902239f8445c8d30ec4541d7827fe30c1df1363626bf8830ae4d8b2" exitCode=0
Jan 30 21:18:26 crc kubenswrapper[4914]: I0130 21:18:26.781842 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mchvp" event={"ID":"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964","Type":"ContainerDied","Data":"6e665be8f902239f8445c8d30ec4541d7827fe30c1df1363626bf8830ae4d8b2"}
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.634171 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mchvp"
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.695474 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ncf4\" (UniqueName: \"kubernetes.io/projected/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-kube-api-access-2ncf4\") pod \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") "
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.695538 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-catalog-content\") pod \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") "
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.695559 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-utilities\") pod \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\" (UID: \"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964\") "
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.696470 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-utilities" (OuterVolumeSpecName: "utilities") pod "cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" (UID: "cc57ab23-e2b4-42f7-a4ac-d1cb1871e964"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.704882 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-kube-api-access-2ncf4" (OuterVolumeSpecName: "kube-api-access-2ncf4") pod "cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" (UID: "cc57ab23-e2b4-42f7-a4ac-d1cb1871e964"). InnerVolumeSpecName "kube-api-access-2ncf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.752571 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" (UID: "cc57ab23-e2b4-42f7-a4ac-d1cb1871e964"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.793362 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mchvp" event={"ID":"cc57ab23-e2b4-42f7-a4ac-d1cb1871e964","Type":"ContainerDied","Data":"c90059274740e274654a112979d08ee60eca8dbc1972fe6ba282d5b10514b137"}
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.793407 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mchvp"
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.793430 4914 scope.go:117] "RemoveContainer" containerID="6e665be8f902239f8445c8d30ec4541d7827fe30c1df1363626bf8830ae4d8b2"
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.795928 4914 generic.go:334] "Generic (PLEG): container finished" podID="928780fe-51a3-4e38-b573-31145d0a720c" containerID="029c3bf08148e737f453184ccdf8133d173a577950ac32f919ac94a83349e057" exitCode=0
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.795973 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v6j7j" event={"ID":"928780fe-51a3-4e38-b573-31145d0a720c","Type":"ContainerDied","Data":"029c3bf08148e737f453184ccdf8133d173a577950ac32f919ac94a83349e057"}
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.796507 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ncf4\" (UniqueName: \"kubernetes.io/projected/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-kube-api-access-2ncf4\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.796539 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.796552 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.819303 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mchvp"]
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.820534 4914 scope.go:117] "RemoveContainer" containerID="77b46b4b9dbad02d965296f93a07825864882a84a9373cb868c02c71619777f4"
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.821699 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mchvp"]
Jan 30 21:18:28 crc kubenswrapper[4914]: I0130 21:18:28.854443 4914 scope.go:117] "RemoveContainer" containerID="77f00ee02b24543c1fb4cb7c11e07f9c86a780537f191e91a069056a6bb6828a"
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.045815 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-76ck9"]
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.046093 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-76ck9" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="registry-server" containerID="cri-o://3360641ec0456f1db76cd3f0424ed164fb998c7e3f29c9cfa748bb4a2502f415" gracePeriod=2
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.368301 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6j7j"
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.422319 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbbx6\" (UniqueName: \"kubernetes.io/projected/928780fe-51a3-4e38-b573-31145d0a720c-kube-api-access-xbbx6\") pod \"928780fe-51a3-4e38-b573-31145d0a720c\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") "
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.422432 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-utilities\") pod \"928780fe-51a3-4e38-b573-31145d0a720c\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") "
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.422472 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-catalog-content\") pod \"928780fe-51a3-4e38-b573-31145d0a720c\" (UID: \"928780fe-51a3-4e38-b573-31145d0a720c\") "
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.423172 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-utilities" (OuterVolumeSpecName: "utilities") pod "928780fe-51a3-4e38-b573-31145d0a720c" (UID: "928780fe-51a3-4e38-b573-31145d0a720c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.425905 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928780fe-51a3-4e38-b573-31145d0a720c-kube-api-access-xbbx6" (OuterVolumeSpecName: "kube-api-access-xbbx6") pod "928780fe-51a3-4e38-b573-31145d0a720c" (UID: "928780fe-51a3-4e38-b573-31145d0a720c"). InnerVolumeSpecName "kube-api-access-xbbx6".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.433947 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.433971 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbbx6\" (UniqueName: \"kubernetes.io/projected/928780fe-51a3-4e38-b573-31145d0a720c-kube-api-access-xbbx6\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.473101 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "928780fe-51a3-4e38-b573-31145d0a720c" (UID: "928780fe-51a3-4e38-b573-31145d0a720c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.535030 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928780fe-51a3-4e38-b573-31145d0a720c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.804312 4914 generic.go:334] "Generic (PLEG): container finished" podID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerID="3360641ec0456f1db76cd3f0424ed164fb998c7e3f29c9cfa748bb4a2502f415" exitCode=0 Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.804391 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ck9" event={"ID":"9d72dac1-62cd-4ab2-bf74-89c95d9762b0","Type":"ContainerDied","Data":"3360641ec0456f1db76cd3f0424ed164fb998c7e3f29c9cfa748bb4a2502f415"} Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.808429 4914 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-v6j7j" event={"ID":"928780fe-51a3-4e38-b573-31145d0a720c","Type":"ContainerDied","Data":"5831f457691f8c908722163d4c98c901c7ab8d4290a21ae09273e8c9aadda7e0"} Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.808476 4914 scope.go:117] "RemoveContainer" containerID="029c3bf08148e737f453184ccdf8133d173a577950ac32f919ac94a83349e057" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.808520 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v6j7j" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.811618 4914 generic.go:334] "Generic (PLEG): container finished" podID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerID="5330b805f7cc923926f44c561d3b37e5ef6bb8ce6c7777bcea85627a1ce4376b" exitCode=0 Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.811671 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8vwgc" event={"ID":"6e7c4a68-42c2-451f-b4ab-411361c45c63","Type":"ContainerDied","Data":"5330b805f7cc923926f44c561d3b37e5ef6bb8ce6c7777bcea85627a1ce4376b"} Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.824139 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" path="/var/lib/kubelet/pods/cc57ab23-e2b4-42f7-a4ac-d1cb1871e964/volumes" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.828547 4914 scope.go:117] "RemoveContainer" containerID="ef4e722e0c4b749f59efff6fa96fe38c78d4523e3a54aef6bdfd601b0279056d" Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.840000 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v6j7j"] Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.842620 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v6j7j"] Jan 30 21:18:29 crc kubenswrapper[4914]: I0130 21:18:29.851912 
4914 scope.go:117] "RemoveContainer" containerID="b3b6fe6dd77a12c107f2db15d939099ad27d73a7bf1ce66c8fee111f29723818" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.407050 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.446097 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-catalog-content\") pod \"6e7c4a68-42c2-451f-b4ab-411361c45c63\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.446195 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j7vx\" (UniqueName: \"kubernetes.io/projected/6e7c4a68-42c2-451f-b4ab-411361c45c63-kube-api-access-8j7vx\") pod \"6e7c4a68-42c2-451f-b4ab-411361c45c63\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.446286 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-utilities\") pod \"6e7c4a68-42c2-451f-b4ab-411361c45c63\" (UID: \"6e7c4a68-42c2-451f-b4ab-411361c45c63\") " Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.447897 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-utilities" (OuterVolumeSpecName: "utilities") pod "6e7c4a68-42c2-451f-b4ab-411361c45c63" (UID: "6e7c4a68-42c2-451f-b4ab-411361c45c63"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.452208 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7c4a68-42c2-451f-b4ab-411361c45c63-kube-api-access-8j7vx" (OuterVolumeSpecName: "kube-api-access-8j7vx") pod "6e7c4a68-42c2-451f-b4ab-411361c45c63" (UID: "6e7c4a68-42c2-451f-b4ab-411361c45c63"). InnerVolumeSpecName "kube-api-access-8j7vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.481514 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e7c4a68-42c2-451f-b4ab-411361c45c63" (UID: "6e7c4a68-42c2-451f-b4ab-411361c45c63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.549117 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j7vx\" (UniqueName: \"kubernetes.io/projected/6e7c4a68-42c2-451f-b4ab-411361c45c63-kube-api-access-8j7vx\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.549149 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.549159 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e7c4a68-42c2-451f-b4ab-411361c45c63-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.661485 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.751300 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-catalog-content\") pod \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.751403 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-utilities\") pod \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.751436 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8zpd\" (UniqueName: \"kubernetes.io/projected/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-kube-api-access-h8zpd\") pod \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\" (UID: \"9d72dac1-62cd-4ab2-bf74-89c95d9762b0\") " Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.752358 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-utilities" (OuterVolumeSpecName: "utilities") pod "9d72dac1-62cd-4ab2-bf74-89c95d9762b0" (UID: "9d72dac1-62cd-4ab2-bf74-89c95d9762b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.755892 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-kube-api-access-h8zpd" (OuterVolumeSpecName: "kube-api-access-h8zpd") pod "9d72dac1-62cd-4ab2-bf74-89c95d9762b0" (UID: "9d72dac1-62cd-4ab2-bf74-89c95d9762b0"). InnerVolumeSpecName "kube-api-access-h8zpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.824941 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8vwgc" event={"ID":"6e7c4a68-42c2-451f-b4ab-411361c45c63","Type":"ContainerDied","Data":"b0a4b2c558f83b2a251b9d900b4c168876383122f57d5625a785ad547d3b31fb"} Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.825292 4914 scope.go:117] "RemoveContainer" containerID="5330b805f7cc923926f44c561d3b37e5ef6bb8ce6c7777bcea85627a1ce4376b" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.824954 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8vwgc" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.828477 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ck9" event={"ID":"9d72dac1-62cd-4ab2-bf74-89c95d9762b0","Type":"ContainerDied","Data":"36c81e40799bd1b1146d692a8980aa818d74455ebde9a815cb34920f0f3be431"} Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.828502 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-76ck9" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.843237 4914 scope.go:117] "RemoveContainer" containerID="86da6e9aac3a39582835beae1baf5e3dc130b75737c33003caf4bf7e2e18558e" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.853867 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.854384 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8zpd\" (UniqueName: \"kubernetes.io/projected/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-kube-api-access-h8zpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.874938 4914 scope.go:117] "RemoveContainer" containerID="e0543019f162bed939325c9c9581a1ce22733965da6ebdbac22ca576e2450847" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.877685 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8vwgc"] Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.886255 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8vwgc"] Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.892345 4914 scope.go:117] "RemoveContainer" containerID="3360641ec0456f1db76cd3f0424ed164fb998c7e3f29c9cfa748bb4a2502f415" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.903241 4914 scope.go:117] "RemoveContainer" containerID="cc003e570593d3d217b888a5b50d9e9459335432eaae5f30632565aec9370fa4" Jan 30 21:18:30 crc kubenswrapper[4914]: I0130 21:18:30.917746 4914 scope.go:117] "RemoveContainer" containerID="e660781064cef24c7cc43acef415f48a74968c9e7526f08eb68b65e1597bc136" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.101764 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d72dac1-62cd-4ab2-bf74-89c95d9762b0" (UID: "9d72dac1-62cd-4ab2-bf74-89c95d9762b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.159845 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-76ck9"] Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.160380 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d72dac1-62cd-4ab2-bf74-89c95d9762b0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.165839 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-76ck9"] Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455152 4914 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455399 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455415 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455425 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455430 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455442 4914 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455466 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455480 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455488 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455499 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455506 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455518 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455525 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455534 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455541 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455553 4914 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455560 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455573 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455580 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="extract-content" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455591 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455599 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="extract-utilities" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455616 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455624 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.455636 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455642 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455766 4914 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455784 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc57ab23-e2b4-42f7-a4ac-d1cb1871e964" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455793 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="928780fe-51a3-4e38-b573-31145d0a720c" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.455802 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" containerName="registry-server" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.456237 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.457116 4914 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.457569 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b" gracePeriod=15 Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.457590 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f" gracePeriod=15 Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.457590 4914 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500" gracePeriod=15 Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.457725 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d" gracePeriod=15 Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.457525 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f" gracePeriod=15 Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458161 4914 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458376 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458392 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458402 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458410 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458420 4914 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458426 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458433 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458438 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458455 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458460 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458469 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458475 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.458483 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458490 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458596 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458609 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458617 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458624 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458633 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.458821 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.552806 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.565764 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.565819 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.565874 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.565891 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.565909 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.566043 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 
21:18:31.566114 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.566162 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.666924 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.666972 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667003 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667008 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667031 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667050 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667064 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667076 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667104 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667107 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667126 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667148 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667131 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667162 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667195 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.667214 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.803225 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:18:31 crc kubenswrapper[4914]: W0130 21:18:31.818942 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6bd8cdc6b1a752e8edd8b6d196a86fcf749499cd2780f8e17e0383e62c06ef37 WatchSource:0}: Error finding container 6bd8cdc6b1a752e8edd8b6d196a86fcf749499cd2780f8e17e0383e62c06ef37: Status 404 returned error can't find the container with id 6bd8cdc6b1a752e8edd8b6d196a86fcf749499cd2780f8e17e0383e62c06ef37 Jan 30 21:18:31 crc kubenswrapper[4914]: E0130 21:18:31.822223 4914 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f9eeb40cf81a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:18:31.821738404 +0000 UTC m=+245.260375165,LastTimestamp:2026-01-30 21:18:31.821738404 +0000 UTC m=+245.260375165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.826326 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6e7c4a68-42c2-451f-b4ab-411361c45c63" path="/var/lib/kubelet/pods/6e7c4a68-42c2-451f-b4ab-411361c45c63/volumes" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.826946 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928780fe-51a3-4e38-b573-31145d0a720c" path="/var/lib/kubelet/pods/928780fe-51a3-4e38-b573-31145d0a720c/volumes" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.827627 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d72dac1-62cd-4ab2-bf74-89c95d9762b0" path="/var/lib/kubelet/pods/9d72dac1-62cd-4ab2-bf74-89c95d9762b0/volumes" Jan 30 21:18:31 crc kubenswrapper[4914]: I0130 21:18:31.837529 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6bd8cdc6b1a752e8edd8b6d196a86fcf749499cd2780f8e17e0383e62c06ef37"} Jan 30 21:18:32 crc kubenswrapper[4914]: E0130 21:18:32.514145 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:32Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:32Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:32Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:32Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:9bde862635f230b66b73aad05940f6cf2c0555a47fe1db330a20724acca8d497\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:db103f9b4d410efdd30da231ffebe8f093377e6c1e4064ddc68046925eb4627f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1680805611},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:1be9df9846a1afdcabb94b502538e28b99b6748cc22415f1be58ab4cb7a391b8\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:9f846e202c62c9de285e0af13de8057685dff0d285709f110f88725e10d32d82\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"
sizeBytes\\\":1202160358},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6c67da62006777d55b7b080cc6b573a07268339208c947747f80c7f859e9e112\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:f211463fbc7e28de13d62907777cc83cd04733b610222bf93c1572489babc933\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1187264246},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:420326d8488ceff2cde22ad8b85d739b0c254d47e703f7ddb1f08f77a48816a6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:54817da328fa589491a3acbe80acdd88c0830dcc63aaafc08c3539925a1a3b03\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1180692192},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: E0130 21:18:32.514906 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: E0130 21:18:32.515498 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: E0130 21:18:32.515930 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: E0130 21:18:32.516298 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: E0130 21:18:32.516337 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.844957 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.846984 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.847570 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500" exitCode=0 Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.847601 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f" exitCode=0 Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.847611 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b" exitCode=0 Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.847620 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d" exitCode=2 Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.847656 4914 scope.go:117] "RemoveContainer" containerID="184ba330fc1ba783bdb83ba11ca05898753fcd13dff14b1cd47b26beea0b3f2b" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.848760 4914 generic.go:334] "Generic (PLEG): container finished" podID="3f5bf593-1c26-4d51-a30a-45477c960de6" containerID="8167bc91614a41794472650e9eb72b1cb952d413aa4cd724888bda1536eae6b6" exitCode=0 Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.848807 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"3f5bf593-1c26-4d51-a30a-45477c960de6","Type":"ContainerDied","Data":"8167bc91614a41794472650e9eb72b1cb952d413aa4cd724888bda1536eae6b6"} Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.849273 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.849422 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.849842 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268"} Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.850519 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:32 crc kubenswrapper[4914]: I0130 21:18:32.850917 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:33 crc kubenswrapper[4914]: I0130 21:18:33.867114 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.192609 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.193801 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.194350 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.316404 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5bf593-1c26-4d51-a30a-45477c960de6-kube-api-access\") pod \"3f5bf593-1c26-4d51-a30a-45477c960de6\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.316478 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-kubelet-dir\") 
pod \"3f5bf593-1c26-4d51-a30a-45477c960de6\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.316622 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-var-lock\") pod \"3f5bf593-1c26-4d51-a30a-45477c960de6\" (UID: \"3f5bf593-1c26-4d51-a30a-45477c960de6\") " Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.317298 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-var-lock" (OuterVolumeSpecName: "var-lock") pod "3f5bf593-1c26-4d51-a30a-45477c960de6" (UID: "3f5bf593-1c26-4d51-a30a-45477c960de6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.317368 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3f5bf593-1c26-4d51-a30a-45477c960de6" (UID: "3f5bf593-1c26-4d51-a30a-45477c960de6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.334215 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5bf593-1c26-4d51-a30a-45477c960de6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3f5bf593-1c26-4d51-a30a-45477c960de6" (UID: "3f5bf593-1c26-4d51-a30a-45477c960de6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.417998 4914 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.418271 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f5bf593-1c26-4d51-a30a-45477c960de6-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.418281 4914 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f5bf593-1c26-4d51-a30a-45477c960de6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.878937 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3f5bf593-1c26-4d51-a30a-45477c960de6","Type":"ContainerDied","Data":"7e28bc98ac5413785bfbf50b7e6e3aa7fba8d9c21e7b8cb13f0dce3df3f42757"} Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.878985 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e28bc98ac5413785bfbf50b7e6e3aa7fba8d9c21e7b8cb13f0dce3df3f42757" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.879006 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.882030 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.882857 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f" exitCode=0 Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.902317 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:34 crc kubenswrapper[4914]: I0130 21:18:34.902940 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.028817 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.029841 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.030604 4914 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.031106 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.031598 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.158649 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.158737 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.158774 4914 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.158826 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.158883 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.158970 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.159274 4914 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.159293 4914 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.159305 4914 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.900637 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.902216 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.902335 4914 scope.go:117] "RemoveContainer" containerID="8a875ca155a9cf98a8a36d93fa3cd8c7d8e977332d56f50f5e2259c5ebd0f500" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.925356 4914 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.925808 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.926433 4914 scope.go:117] "RemoveContainer" containerID="84a11a6fb7acf9690a8eaf0d84b10dec1ac202768d8fa954b8b10c40f97ae28f" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.926460 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.947310 4914 scope.go:117] "RemoveContainer" containerID="c247fd585ef929ca74ad7bd8d80023a689164b0df5cb7d12a7ec66ac86ad4e2b" Jan 30 21:18:36 crc kubenswrapper[4914]: I0130 21:18:36.969256 4914 scope.go:117] "RemoveContainer" containerID="3b5ed212025e50ec71f962905c234873fdc67b25b616033dd0182cd579ee708d" Jan 30 21:18:36 crc 
kubenswrapper[4914]: I0130 21:18:36.995137 4914 scope.go:117] "RemoveContainer" containerID="853e5eb0325c416f1728774b20c8f15f19ad6a077e02315113a92f081d30333f" Jan 30 21:18:37 crc kubenswrapper[4914]: I0130 21:18:37.022574 4914 scope.go:117] "RemoveContainer" containerID="6fead3ed0b2a951b4801d7e658a0d216632797304cd83248c54b7ebe72f7ef0b" Jan 30 21:18:37 crc kubenswrapper[4914]: I0130 21:18:37.823330 4914 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:37 crc kubenswrapper[4914]: I0130 21:18:37.823884 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:37 crc kubenswrapper[4914]: I0130 21:18:37.824378 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:37 crc kubenswrapper[4914]: I0130 21:18:37.828045 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.182131 4914 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.182788 4914 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.183364 4914 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.184036 4914 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.184557 4914 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:38 crc kubenswrapper[4914]: I0130 21:18:38.184624 4914 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.185199 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="200ms" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.391160 4914 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="400ms" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.565823 4914 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.74:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f9eeb40cf81a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:18:31.821738404 +0000 UTC m=+245.260375165,LastTimestamp:2026-01-30 21:18:31.821738404 +0000 UTC m=+245.260375165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:18:38 crc kubenswrapper[4914]: E0130 21:18:38.792454 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="800ms" Jan 30 21:18:39 crc kubenswrapper[4914]: E0130 21:18:39.593229 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="1.6s" Jan 30 21:18:41 crc kubenswrapper[4914]: E0130 21:18:41.194803 4914 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" interval="3.2s" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.741532 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:42Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:42Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:42Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:18:42Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:9bde862635f230b66b73aad05940f6cf2c0555a47fe1db330a20724acca8d497\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:db103f9b4d410efdd30da231ffebe8f093377e6c1e4064ddc68046925eb4627f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1680805611},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\
":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:1be9df9846a1afdcabb94b502538e28b99b6748cc22415f1be58ab4cb7a391b8\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:9f846e202c62c9de285e0af13de8057685dff0d285709f110f88725e10d32d82\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202160358},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6c67da62006777d55b7b080cc6b573a07268339208c947747f80c7f859e9e112\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:f211463fbc7e28de13d62907777cc83cd04733b610222bf93c1572489babc933\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1187264246},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:420326d8488ceff2cde22ad8b85d739b0c254d47e703f7ddb1f08f77a48816a6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:54817da328fa589491a3acbe80acdd88c0830dcc63aaafc08c3539925a1a3b03\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1180692192},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"name
s\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f
6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.742770 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.743297 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.743598 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.743963 4914 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.743997 4914 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.817058 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.821382 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.821695 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.849868 4914 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.849917 4914 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:42 crc kubenswrapper[4914]: E0130 21:18:42.850596 4914 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.851495 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:42 crc kubenswrapper[4914]: I0130 21:18:42.945191 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"37a6384dc258db38859fcafb9b64376df5bd96e493351d5216f7c27a4143a0e3"} Jan 30 21:18:43 crc kubenswrapper[4914]: I0130 21:18:43.956067 4914 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e73f7dbf44f213d960bfeeb3ab384eda67bb61ac440dcb7c97b8f4ad1aeb260f" exitCode=0 Jan 30 21:18:43 crc kubenswrapper[4914]: I0130 21:18:43.956198 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e73f7dbf44f213d960bfeeb3ab384eda67bb61ac440dcb7c97b8f4ad1aeb260f"} Jan 30 21:18:43 crc kubenswrapper[4914]: I0130 21:18:43.956577 4914 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:43 crc kubenswrapper[4914]: I0130 21:18:43.956617 4914 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:43 crc kubenswrapper[4914]: E0130 21:18:43.957462 4914 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:43 crc kubenswrapper[4914]: I0130 21:18:43.957506 4914 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:43 crc kubenswrapper[4914]: I0130 21:18:43.957994 4914 status_manager.go:851] "Failed to get status for pod" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.74:6443: connect: connection refused" Jan 30 21:18:44 crc kubenswrapper[4914]: I0130 21:18:44.964802 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1aa1e3d53c0605fffbac6dda21734877b9a3d45d36fafa2ffc300b1cfbcca87b"} Jan 30 21:18:44 crc kubenswrapper[4914]: I0130 21:18:44.965403 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fdf1efcbe8a676912d6cca4b4b4df7cf8c0859abf0fab1aacdcacdcbe4bc4e08"} Jan 30 21:18:44 crc kubenswrapper[4914]: I0130 21:18:44.965421 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5dbd2f05a0c674296bcfde5dc252068a8ce60fb58c8f1b62c05534e875656de6"} Jan 30 21:18:44 crc kubenswrapper[4914]: I0130 21:18:44.965433 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"51b2ae7bf1cb76aac56b2581f2617fedfea77636f6be2e088b8dc8ea038e043d"} Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.975672 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.975834 4914 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a" exitCode=1 Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.975940 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a"} Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.977074 4914 scope.go:117] "RemoveContainer" containerID="b92636aa5fb96f6483f77965ea9b5c32c814eaff9aab7abfb03f3d97c86f838a" Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.981148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0c01b449f7325f27e374bc1482da6719682733228693f151229979fab56e6395"} Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.981432 4914 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.981454 4914 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:45 crc kubenswrapper[4914]: I0130 21:18:45.981631 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:46 crc kubenswrapper[4914]: I0130 21:18:46.994312 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 21:18:46 crc kubenswrapper[4914]: I0130 21:18:46.994647 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e8009a9f8a41ee13dd15e03dd233e7362a44d243564c9266dc92dabc3e03c6b6"} Jan 30 21:18:47 crc kubenswrapper[4914]: I0130 21:18:47.852246 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:47 crc kubenswrapper[4914]: I0130 21:18:47.852385 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:47 crc kubenswrapper[4914]: I0130 21:18:47.858692 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:50 crc kubenswrapper[4914]: I0130 21:18:50.999585 4914 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:51 crc kubenswrapper[4914]: I0130 21:18:51.034365 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:18:51 crc kubenswrapper[4914]: I0130 21:18:51.072954 4914 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d76d1dfc-9760-4870-b3d4-7d045b8380ec" Jan 30 21:18:52 crc kubenswrapper[4914]: I0130 21:18:52.024806 4914 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:52 crc kubenswrapper[4914]: I0130 21:18:52.025196 4914 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:52 crc kubenswrapper[4914]: I0130 21:18:52.028968 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:18:52 crc kubenswrapper[4914]: I0130 21:18:52.029525 4914 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d76d1dfc-9760-4870-b3d4-7d045b8380ec" Jan 30 21:18:53 crc kubenswrapper[4914]: I0130 21:18:53.034076 4914 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:53 crc kubenswrapper[4914]: I0130 21:18:53.034134 4914 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8bfaacc5-ca3c-466f-a77d-efe8ce0ac0a9" Jan 30 21:18:53 crc kubenswrapper[4914]: I0130 21:18:53.038861 4914 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d76d1dfc-9760-4870-b3d4-7d045b8380ec" Jan 30 21:18:55 crc kubenswrapper[4914]: I0130 21:18:55.617130 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:18:55 crc kubenswrapper[4914]: I0130 21:18:55.621863 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:18:56 crc kubenswrapper[4914]: I0130 21:18:56.056591 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:18:58 crc kubenswrapper[4914]: I0130 
21:18:58.948293 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 21:19:01 crc kubenswrapper[4914]: I0130 21:19:01.218153 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 21:19:01 crc kubenswrapper[4914]: I0130 21:19:01.526427 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 21:19:01 crc kubenswrapper[4914]: I0130 21:19:01.575996 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 21:19:01 crc kubenswrapper[4914]: I0130 21:19:01.964851 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 21:19:02 crc kubenswrapper[4914]: I0130 21:19:02.112384 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 21:19:02 crc kubenswrapper[4914]: I0130 21:19:02.530049 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 21:19:02 crc kubenswrapper[4914]: I0130 21:19:02.768094 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.007587 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.204681 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.332662 4914 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.354051 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.401913 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.500608 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.762919 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.780809 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 21:19:03 crc kubenswrapper[4914]: I0130 21:19:03.929112 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.093249 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.394900 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.417123 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.428538 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.438579 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.536949 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.687938 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.688562 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.718170 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.759113 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 30 21:19:04 crc kubenswrapper[4914]: I0130 21:19:04.947582 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.038316 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.193609 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.425588 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.445587 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.539168 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.590018 4914 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.645867 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.707310 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.810921 4914 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.811929 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=34.811903704 podStartE2EDuration="34.811903704s" podCreationTimestamp="2026-01-30 21:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:51.014780054 +0000 UTC m=+264.453416835" watchObservedRunningTime="2026-01-30 21:19:05.811903704 +0000 UTC m=+279.250540505" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.830850 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.830911 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.839055 4914 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.870251 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.870219046 podStartE2EDuration="14.870219046s" podCreationTimestamp="2026-01-30 21:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:05.861682568 +0000 UTC m=+279.300319369" watchObservedRunningTime="2026-01-30 21:19:05.870219046 +0000 UTC m=+279.308855857" Jan 30 21:19:05 crc kubenswrapper[4914]: I0130 21:19:05.948698 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.016580 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.033817 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.048317 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.239340 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.310468 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.311551 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 
21:19:06.322624 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.507592 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.526129 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.576848 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.629584 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.821414 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.903882 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.922441 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.927865 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 21:19:06 crc kubenswrapper[4914]: I0130 21:19:06.938427 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.061989 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.150057 4914 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.187197 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.239273 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.247935 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.280446 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.399089 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.486567 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.549996 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.600504 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.619749 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.632471 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 21:19:07 crc 
kubenswrapper[4914]: I0130 21:19:07.650812 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.776138 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.784135 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.806587 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.826888 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.830236 4914 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.848240 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.885185 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 21:19:07 crc kubenswrapper[4914]: I0130 21:19:07.964040 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.024347 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.087489 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" 
Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.192021 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.205066 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.213845 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.224258 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.227083 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.252627 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.259054 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.285310 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.293863 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.351946 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.352860 4914 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.442593 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.483939 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.496052 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.612141 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.653612 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.885208 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.888397 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.941176 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.943679 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 21:19:08 crc kubenswrapper[4914]: I0130 21:19:08.983006 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 21:19:09 crc 
kubenswrapper[4914]: I0130 21:19:09.052666 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.106316 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.221631 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.255787 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.292959 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.429606 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.458395 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.630262 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.897962 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.901193 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.956194 4914 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 21:19:09 crc kubenswrapper[4914]: I0130 21:19:09.992241 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.042373 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.059855 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.075275 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.116911 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.217676 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.267823 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.276397 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.514810 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.588350 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 21:19:10 crc 
kubenswrapper[4914]: I0130 21:19:10.602248 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.674175 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.712551 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.772995 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.776923 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.872104 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:19:10 crc kubenswrapper[4914]: I0130 21:19:10.926328 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.016983 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.058437 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.084317 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.129346 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.129772 4914 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.220205 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.346003 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.415899 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.431441 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.461008 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.474752 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.504954 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.509108 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.515417 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.520107 4914 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.674090 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.698437 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.853998 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 21:19:11 crc kubenswrapper[4914]: I0130 21:19:11.891107 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.014681 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.044218 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.048887 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.127238 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.134220 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.264942 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.326565 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.373009 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.375440 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.394799 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.396199 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.401610 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.417946 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.438747 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.629166 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.653244 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.673890 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.708768 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.892008 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.908461 4914 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.926123 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.970335 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 30 21:19:12 crc kubenswrapper[4914]: I0130 21:19:12.982874 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.020918 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.088800 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.155535 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.178809 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.198483 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.221014 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.338152 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.351904 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.358550 4914 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.359014 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268" gracePeriod=5
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.436644 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.452934 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.467483 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.468852 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"]
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.469113 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" podUID="6fc6e283-e750-42c7-b637-9d6c0c678ff7" containerName="route-controller-manager" containerID="cri-o://6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827" gracePeriod=30
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.515385 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.522456 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.552413 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-779f5dd757-wvdx6"]
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.552703 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" podUID="d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" containerName="controller-manager" containerID="cri-o://9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4" gracePeriod=30
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.711556 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.722466 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.847961 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.914026 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js949\" (UniqueName: \"kubernetes.io/projected/6fc6e283-e750-42c7-b637-9d6c0c678ff7-kube-api-access-js949\") pod \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") "
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.914069 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-config\") pod \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") "
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.915577 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-config" (OuterVolumeSpecName: "config") pod "6fc6e283-e750-42c7-b637-9d6c0c678ff7" (UID: "6fc6e283-e750-42c7-b637-9d6c0c678ff7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.920464 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc6e283-e750-42c7-b637-9d6c0c678ff7-kube-api-access-js949" (OuterVolumeSpecName: "kube-api-access-js949") pod "6fc6e283-e750-42c7-b637-9d6c0c678ff7" (UID: "6fc6e283-e750-42c7-b637-9d6c0c678ff7"). InnerVolumeSpecName "kube-api-access-js949". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.924699 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.952762 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.971063 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 30 21:19:13 crc kubenswrapper[4914]: I0130 21:19:13.999551 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.014622 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-client-ca\") pod \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.014666 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc6e283-e750-42c7-b637-9d6c0c678ff7-serving-cert\") pod \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\" (UID: \"6fc6e283-e750-42c7-b637-9d6c0c678ff7\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.014894 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js949\" (UniqueName: \"kubernetes.io/projected/6fc6e283-e750-42c7-b637-9d6c0c678ff7-kube-api-access-js949\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.014910 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.016032 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-client-ca" (OuterVolumeSpecName: "client-ca") pod "6fc6e283-e750-42c7-b637-9d6c0c678ff7" (UID: "6fc6e283-e750-42c7-b637-9d6c0c678ff7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.018100 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fc6e283-e750-42c7-b637-9d6c0c678ff7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6fc6e283-e750-42c7-b637-9d6c0c678ff7" (UID: "6fc6e283-e750-42c7-b637-9d6c0c678ff7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.115794 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-client-ca\") pod \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.115958 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-proxy-ca-bundles\") pod \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.116058 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7lbx\" (UniqueName: \"kubernetes.io/projected/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-kube-api-access-n7lbx\") pod \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.116185 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-config\") pod \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.116223 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-serving-cert\") pod \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\" (UID: \"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb\") "
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.116666 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc6e283-e750-42c7-b637-9d6c0c678ff7-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.116777 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc6e283-e750-42c7-b637-9d6c0c678ff7-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.118692 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" (UID: "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.119749 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" (UID: "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.120423 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" (UID: "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.121043 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-config" (OuterVolumeSpecName: "config") pod "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" (UID: "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.123546 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-kube-api-access-n7lbx" (OuterVolumeSpecName: "kube-api-access-n7lbx") pod "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" (UID: "d4b35b3b-59cf-4678-aa68-9b8b3f106ccb"). InnerVolumeSpecName "kube-api-access-n7lbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.217965 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.218014 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.218038 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7lbx\" (UniqueName: \"kubernetes.io/projected/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-kube-api-access-n7lbx\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.218077 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.218096 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.235838 4914 generic.go:334] "Generic (PLEG): container finished" podID="d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" containerID="9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4" exitCode=0
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.235977 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.236600 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" event={"ID":"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb","Type":"ContainerDied","Data":"9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4"}
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.236674 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779f5dd757-wvdx6" event={"ID":"d4b35b3b-59cf-4678-aa68-9b8b3f106ccb","Type":"ContainerDied","Data":"bd7212e239954001e76b764eb6feb60fb949a6b29c1e4ce763f28d97e3067402"}
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.236751 4914 scope.go:117] "RemoveContainer" containerID="9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.239049 4914 generic.go:334] "Generic (PLEG): container finished" podID="6fc6e283-e750-42c7-b637-9d6c0c678ff7" containerID="6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827" exitCode=0
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.239109 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" event={"ID":"6fc6e283-e750-42c7-b637-9d6c0c678ff7","Type":"ContainerDied","Data":"6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827"}
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.239148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds" event={"ID":"6fc6e283-e750-42c7-b637-9d6c0c678ff7","Type":"ContainerDied","Data":"6c8463ab3355dac793bbd4c92783c560df09481345c67134bc38860ad78eb9ff"}
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.239287 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.247736 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.258843 4914 scope.go:117] "RemoveContainer" containerID="9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4"
Jan 30 21:19:14 crc kubenswrapper[4914]: E0130 21:19:14.259326 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4\": container with ID starting with 9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4 not found: ID does not exist" containerID="9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.259374 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4"} err="failed to get container status \"9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4\": rpc error: code = NotFound desc = could not find container \"9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4\": container with ID starting with 9ae18bc21a2aca04cab51efc6e698de690f056f4177605793cedae1f4edef9f4 not found: ID does not exist"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.259409 4914 scope.go:117] "RemoveContainer" containerID="6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.277152 4914 scope.go:117] "RemoveContainer" containerID="6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827"
Jan 30 21:19:14 crc kubenswrapper[4914]: E0130 21:19:14.277569 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827\": container with ID starting with 6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827 not found: ID does not exist" containerID="6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.277621 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827"} err="failed to get container status \"6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827\": rpc error: code = NotFound desc = could not find container \"6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827\": container with ID starting with 6276afb811cd828f8aeef0234f36d19c367cc9abc38add31d18fc82980aca827 not found: ID does not exist"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.288303 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-779f5dd757-wvdx6"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.297342 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-779f5dd757-wvdx6"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.307161 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.314347 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55659f8bb-4hhds"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.324225 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.327068 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.444119 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.506846 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.509119 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.627954 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.639686 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.685266 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725061 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"]
Jan 30 21:19:14 crc kubenswrapper[4914]: E0130 21:19:14.725417 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" containerName="controller-manager"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725451 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" containerName="controller-manager"
Jan 30 21:19:14 crc kubenswrapper[4914]: E0130 21:19:14.725478 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725491 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 21:19:14 crc kubenswrapper[4914]: E0130 21:19:14.725512 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc6e283-e750-42c7-b637-9d6c0c678ff7" containerName="route-controller-manager"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725525 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc6e283-e750-42c7-b637-9d6c0c678ff7" containerName="route-controller-manager"
Jan 30 21:19:14 crc kubenswrapper[4914]: E0130 21:19:14.725546 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" containerName="installer"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725559 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" containerName="installer"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725763 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725787 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5bf593-1c26-4d51-a30a-45477c960de6" containerName="installer"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725810 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" containerName="controller-manager"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.725832 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc6e283-e750-42c7-b637-9d6c0c678ff7" containerName="route-controller-manager"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.726492 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.734964 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b468575dc-m2xgz"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.735298 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.737486 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.739215 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.742416 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.742509 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.743181 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.743314 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.743566 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.743691 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.743863 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.742423 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.744183 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.745438 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.750627 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.756007 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.757326 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.757756 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b468575dc-m2xgz"]
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.759215 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.792255 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.824699 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.825359 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402ff0c9-9c77-48da-9888-8ef9f46d776b-serving-cert\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.825426 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-config\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.825474 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-client-ca\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.825591 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v4wk\" (UniqueName: \"kubernetes.io/projected/402ff0c9-9c77-48da-9888-8ef9f46d776b-kube-api-access-2v4wk\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.832606 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.851515 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.879969 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.925153 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.926798 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-config\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.926851 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.926866 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cqz\" (UniqueName: \"kubernetes.io/projected/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-kube-api-access-25cqz\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.926945 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402ff0c9-9c77-48da-9888-8ef9f46d776b-serving-cert\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.926987 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-config\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.927035 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-client-ca\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.927098 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-proxy-ca-bundles\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.927141 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-client-ca\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30
21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.927271 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-serving-cert\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.927364 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4wk\" (UniqueName: \"kubernetes.io/projected/402ff0c9-9c77-48da-9888-8ef9f46d776b-kube-api-access-2v4wk\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.929286 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-client-ca\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.929393 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-config\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.937946 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402ff0c9-9c77-48da-9888-8ef9f46d776b-serving-cert\") pod 
\"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:14 crc kubenswrapper[4914]: I0130 21:19:14.950297 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4wk\" (UniqueName: \"kubernetes.io/projected/402ff0c9-9c77-48da-9888-8ef9f46d776b-kube-api-access-2v4wk\") pod \"route-controller-manager-675bb58dd8-wlzjb\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") " pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.030488 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-proxy-ca-bundles\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.030574 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-client-ca\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.030617 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-serving-cert\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.030726 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-config\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.030777 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cqz\" (UniqueName: \"kubernetes.io/projected/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-kube-api-access-25cqz\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.032334 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-client-ca\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.033148 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-config\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.033306 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-proxy-ca-bundles\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc 
kubenswrapper[4914]: I0130 21:19:15.038255 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-serving-cert\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.048999 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cqz\" (UniqueName: \"kubernetes.io/projected/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-kube-api-access-25cqz\") pod \"controller-manager-5b468575dc-m2xgz\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") " pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.095080 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.119127 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.249063 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.353861 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.395796 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.452031 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.552873 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.603076 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.632235 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.658871 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.794788 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.837291 4914 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.838047 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fc6e283-e750-42c7-b637-9d6c0c678ff7" path="/var/lib/kubelet/pods/6fc6e283-e750-42c7-b637-9d6c0c678ff7/volumes" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.839915 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b35b3b-59cf-4678-aa68-9b8b3f106ccb" path="/var/lib/kubelet/pods/d4b35b3b-59cf-4678-aa68-9b8b3f106ccb/volumes" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.849449 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.894819 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 21:19:15 crc kubenswrapper[4914]: I0130 21:19:15.976600 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.004498 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.048917 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.061696 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.132969 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.149208 4914 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.344634 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.471558 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.567913 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.675937 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.702499 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.893239 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 21:19:16 crc kubenswrapper[4914]: I0130 21:19:16.963144 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.108725 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.237887 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.242085 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.373039 4914 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.421533 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"] Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.473589 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.540477 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b468575dc-m2xgz"] Jan 30 21:19:17 crc kubenswrapper[4914]: W0130 21:19:17.546729 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80ad06d_82c8_417c_841d_d41f5b1ae6ab.slice/crio-3f9b90aa17cb5089c3b492574fd26071880c29c9601f1ba9fd4362a4515b35a5 WatchSource:0}: Error finding container 3f9b90aa17cb5089c3b492574fd26071880c29c9601f1ba9fd4362a4515b35a5: Status 404 returned error can't find the container with id 3f9b90aa17cb5089c3b492574fd26071880c29c9601f1ba9fd4362a4515b35a5 Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.624264 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.809786 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.823013 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.847640 4914 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 21:19:17 crc kubenswrapper[4914]: I0130 21:19:17.861520 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.144663 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.197631 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.276927 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" event={"ID":"b80ad06d-82c8-417c-841d-d41f5b1ae6ab","Type":"ContainerStarted","Data":"cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c"} Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.276995 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" event={"ID":"b80ad06d-82c8-417c-841d-d41f5b1ae6ab","Type":"ContainerStarted","Data":"3f9b90aa17cb5089c3b492574fd26071880c29c9601f1ba9fd4362a4515b35a5"} Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.277257 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.279275 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" event={"ID":"402ff0c9-9c77-48da-9888-8ef9f46d776b","Type":"ContainerStarted","Data":"014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770"} Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.279314 4914 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" event={"ID":"402ff0c9-9c77-48da-9888-8ef9f46d776b","Type":"ContainerStarted","Data":"44fd0f50018c10d27d984f6e0e0e4c34320a50fe61703ef25a3120e6baa49940"} Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.279619 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.284328 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.285497 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.309665 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" podStartSLOduration=5.309627351 podStartE2EDuration="5.309627351s" podCreationTimestamp="2026-01-30 21:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:18.297923003 +0000 UTC m=+291.736559764" watchObservedRunningTime="2026-01-30 21:19:18.309627351 +0000 UTC m=+291.748264162" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.364001 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" podStartSLOduration=5.363976251 podStartE2EDuration="5.363976251s" podCreationTimestamp="2026-01-30 21:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:18.361510902 +0000 UTC m=+291.800147723" 
watchObservedRunningTime="2026-01-30 21:19:18.363976251 +0000 UTC m=+291.802613052" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.752997 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.819586 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.825860 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.945040 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.945116 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.972776 4914 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 21:19:18 crc kubenswrapper[4914]: I0130 21:19:18.991976 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.006420 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.006554 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.006583 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.006602 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.006785 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.006831 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.007306 4914 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.007352 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.007391 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.007467 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.017944 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.108922 4914 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.108960 4914 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.108974 4914 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.108987 4914 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.288958 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.289036 4914 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268" exitCode=137
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.289145 4914 scope.go:117] "RemoveContainer" containerID="93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.289198 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.320691 4914 scope.go:117] "RemoveContainer" containerID="93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268"
Jan 30 21:19:19 crc kubenswrapper[4914]: E0130 21:19:19.321369 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268\": container with ID starting with 93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268 not found: ID does not exist" containerID="93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.321440 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268"} err="failed to get container status \"93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268\": rpc error: code = NotFound desc = could not find container \"93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268\": container with ID starting with 93f64017c71f5785ddba05a9a764647e6eff82ebfc1ac8440b45ff9d0b414268 not found: ID does not exist"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.831103 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.831418 4914 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.845600 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.845647 4914 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6d61d55e-b692-4837-843d-01c2d6a94175"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.853145 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.853221 4914 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6d61d55e-b692-4837-843d-01c2d6a94175"
Jan 30 21:19:19 crc kubenswrapper[4914]: I0130 21:19:19.904690 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 30 21:19:20 crc kubenswrapper[4914]: I0130 21:19:20.109989 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 30 21:19:27 crc kubenswrapper[4914]: I0130 21:19:27.045790 4914 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Jan 30 21:19:33 crc kubenswrapper[4914]: I0130 21:19:33.420143 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b468575dc-m2xgz"]
Jan 30 21:19:33 crc kubenswrapper[4914]: I0130 21:19:33.421067 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" podUID="b80ad06d-82c8-417c-841d-d41f5b1ae6ab" containerName="controller-manager" containerID="cri-o://cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c" gracePeriod=30
Jan 30 21:19:33 crc kubenswrapper[4914]: I0130 21:19:33.448494 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"]
Jan 30 21:19:33 crc kubenswrapper[4914]: I0130 21:19:33.448951 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" podUID="402ff0c9-9c77-48da-9888-8ef9f46d776b" containerName="route-controller-manager" containerID="cri-o://014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770" gracePeriod=30
Jan 30 21:19:33 crc kubenswrapper[4914]: I0130 21:19:33.936662 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:33 crc kubenswrapper[4914]: I0130 21:19:33.987692 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021354 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25cqz\" (UniqueName: \"kubernetes.io/projected/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-kube-api-access-25cqz\") pod \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021427 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-config\") pod \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021495 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-client-ca\") pod \"402ff0c9-9c77-48da-9888-8ef9f46d776b\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021529 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-client-ca\") pod \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021573 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v4wk\" (UniqueName: \"kubernetes.io/projected/402ff0c9-9c77-48da-9888-8ef9f46d776b-kube-api-access-2v4wk\") pod \"402ff0c9-9c77-48da-9888-8ef9f46d776b\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021672 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-config\") pod \"402ff0c9-9c77-48da-9888-8ef9f46d776b\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021728 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-serving-cert\") pod \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021770 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402ff0c9-9c77-48da-9888-8ef9f46d776b-serving-cert\") pod \"402ff0c9-9c77-48da-9888-8ef9f46d776b\" (UID: \"402ff0c9-9c77-48da-9888-8ef9f46d776b\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.021808 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-proxy-ca-bundles\") pod \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\" (UID: \"b80ad06d-82c8-417c-841d-d41f5b1ae6ab\") "
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.023264 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b80ad06d-82c8-417c-841d-d41f5b1ae6ab" (UID: "b80ad06d-82c8-417c-841d-d41f5b1ae6ab"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.023581 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "b80ad06d-82c8-417c-841d-d41f5b1ae6ab" (UID: "b80ad06d-82c8-417c-841d-d41f5b1ae6ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.024585 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-config" (OuterVolumeSpecName: "config") pod "b80ad06d-82c8-417c-841d-d41f5b1ae6ab" (UID: "b80ad06d-82c8-417c-841d-d41f5b1ae6ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.025163 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-config" (OuterVolumeSpecName: "config") pod "402ff0c9-9c77-48da-9888-8ef9f46d776b" (UID: "402ff0c9-9c77-48da-9888-8ef9f46d776b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.025228 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-client-ca" (OuterVolumeSpecName: "client-ca") pod "402ff0c9-9c77-48da-9888-8ef9f46d776b" (UID: "402ff0c9-9c77-48da-9888-8ef9f46d776b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.030281 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ff0c9-9c77-48da-9888-8ef9f46d776b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "402ff0c9-9c77-48da-9888-8ef9f46d776b" (UID: "402ff0c9-9c77-48da-9888-8ef9f46d776b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.032901 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402ff0c9-9c77-48da-9888-8ef9f46d776b-kube-api-access-2v4wk" (OuterVolumeSpecName: "kube-api-access-2v4wk") pod "402ff0c9-9c77-48da-9888-8ef9f46d776b" (UID: "402ff0c9-9c77-48da-9888-8ef9f46d776b"). InnerVolumeSpecName "kube-api-access-2v4wk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.032929 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-kube-api-access-25cqz" (OuterVolumeSpecName: "kube-api-access-25cqz") pod "b80ad06d-82c8-417c-841d-d41f5b1ae6ab" (UID: "b80ad06d-82c8-417c-841d-d41f5b1ae6ab"). InnerVolumeSpecName "kube-api-access-25cqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.032989 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b80ad06d-82c8-417c-841d-d41f5b1ae6ab" (UID: "b80ad06d-82c8-417c-841d-d41f5b1ae6ab"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.122970 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25cqz\" (UniqueName: \"kubernetes.io/projected/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-kube-api-access-25cqz\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123004 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123014 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123023 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123032 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v4wk\" (UniqueName: \"kubernetes.io/projected/402ff0c9-9c77-48da-9888-8ef9f46d776b-kube-api-access-2v4wk\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123040 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402ff0c9-9c77-48da-9888-8ef9f46d776b-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123047 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123056 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/402ff0c9-9c77-48da-9888-8ef9f46d776b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.123065 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b80ad06d-82c8-417c-841d-d41f5b1ae6ab-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.392633 4914 generic.go:334] "Generic (PLEG): container finished" podID="402ff0c9-9c77-48da-9888-8ef9f46d776b" containerID="014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770" exitCode=0
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.392724 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" event={"ID":"402ff0c9-9c77-48da-9888-8ef9f46d776b","Type":"ContainerDied","Data":"014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770"}
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.392788 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb" event={"ID":"402ff0c9-9c77-48da-9888-8ef9f46d776b","Type":"ContainerDied","Data":"44fd0f50018c10d27d984f6e0e0e4c34320a50fe61703ef25a3120e6baa49940"}
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.392811 4914 scope.go:117] "RemoveContainer" containerID="014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.392753 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.394469 4914 generic.go:334] "Generic (PLEG): container finished" podID="b80ad06d-82c8-417c-841d-d41f5b1ae6ab" containerID="cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c" exitCode=0
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.394517 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" event={"ID":"b80ad06d-82c8-417c-841d-d41f5b1ae6ab","Type":"ContainerDied","Data":"cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c"}
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.394558 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz" event={"ID":"b80ad06d-82c8-417c-841d-d41f5b1ae6ab","Type":"ContainerDied","Data":"3f9b90aa17cb5089c3b492574fd26071880c29c9601f1ba9fd4362a4515b35a5"}
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.394529 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b468575dc-m2xgz"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.411807 4914 scope.go:117] "RemoveContainer" containerID="014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770"
Jan 30 21:19:34 crc kubenswrapper[4914]: E0130 21:19:34.412510 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770\": container with ID starting with 014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770 not found: ID does not exist" containerID="014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.412559 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770"} err="failed to get container status \"014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770\": rpc error: code = NotFound desc = could not find container \"014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770\": container with ID starting with 014ac2b533ef0adee0449c5e99f019a29ebd0f7f5cdd529429daadf5d2c9d770 not found: ID does not exist"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.412595 4914 scope.go:117] "RemoveContainer" containerID="cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.429543 4914 scope.go:117] "RemoveContainer" containerID="cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c"
Jan 30 21:19:34 crc kubenswrapper[4914]: E0130 21:19:34.435842 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c\": container with ID starting with cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c not found: ID does not exist" containerID="cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.435932 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c"} err="failed to get container status \"cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c\": rpc error: code = NotFound desc = could not find container \"cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c\": container with ID starting with cbdc5310d2625bc4aa55de438ffb6b7daa29d36441b7fb9912cb29472161c53c not found: ID does not exist"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.454616 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b468575dc-m2xgz"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.460236 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b468575dc-m2xgz"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.464821 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.470019 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-675bb58dd8-wlzjb"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.735921 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"]
Jan 30 21:19:34 crc kubenswrapper[4914]: E0130 21:19:34.736237 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402ff0c9-9c77-48da-9888-8ef9f46d776b" containerName="route-controller-manager"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.736251 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ff0c9-9c77-48da-9888-8ef9f46d776b" containerName="route-controller-manager"
Jan 30 21:19:34 crc kubenswrapper[4914]: E0130 21:19:34.736264 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80ad06d-82c8-417c-841d-d41f5b1ae6ab" containerName="controller-manager"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.736272 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80ad06d-82c8-417c-841d-d41f5b1ae6ab" containerName="controller-manager"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.736420 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="402ff0c9-9c77-48da-9888-8ef9f46d776b" containerName="route-controller-manager"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.736432 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b80ad06d-82c8-417c-841d-d41f5b1ae6ab" containerName="controller-manager"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.736926 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.738930 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.739514 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.740451 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.740671 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.740801 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.741014 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.741311 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.743854 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.744403 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.744564 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.745817 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.745996 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.747784 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.748103 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.748358 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.755798 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.765469 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"]
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.833742 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-client-ca\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.834080 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e090fd50-eb0c-4f2c-b829-a3e446703cdf-serving-cert\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.834326 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-config\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.834441 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-client-ca\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.834601 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr58t\" (UniqueName: \"kubernetes.io/projected/e090fd50-eb0c-4f2c-b829-a3e446703cdf-kube-api-access-xr58t\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.834802 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-config\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.835009 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dnl4\" (UniqueName: \"kubernetes.io/projected/76382623-8cac-42a6-a9de-436e96a8a153-kube-api-access-8dnl4\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.835142 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-proxy-ca-bundles\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.835263 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76382623-8cac-42a6-a9de-436e96a8a153-serving-cert\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.935968 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dnl4\" (UniqueName: \"kubernetes.io/projected/76382623-8cac-42a6-a9de-436e96a8a153-kube-api-access-8dnl4\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936034 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-proxy-ca-bundles\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936058 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76382623-8cac-42a6-a9de-436e96a8a153-serving-cert\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936097 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-client-ca\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936119 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e090fd50-eb0c-4f2c-b829-a3e446703cdf-serving-cert\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936178 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-config\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936203 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-client-ca\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936230 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr58t\" (UniqueName: \"kubernetes.io/projected/e090fd50-eb0c-4f2c-b829-a3e446703cdf-kube-api-access-xr58t\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.936259 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-config\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.937969 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-config\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.939294 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-config\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.939424 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-client-ca\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.939527 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-client-ca\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.939581 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-proxy-ca-bundles\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.945629 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76382623-8cac-42a6-a9de-436e96a8a153-serving-cert\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.948024 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e090fd50-eb0c-4f2c-b829-a3e446703cdf-serving-cert\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.958541 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr58t\" (UniqueName: \"kubernetes.io/projected/e090fd50-eb0c-4f2c-b829-a3e446703cdf-kube-api-access-xr58t\") pod \"controller-manager-c5ffdcbcc-4zfnr\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:34 crc kubenswrapper[4914]: I0130 21:19:34.964113 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dnl4\" (UniqueName: \"kubernetes.io/projected/76382623-8cac-42a6-a9de-436e96a8a153-kube-api-access-8dnl4\") pod \"route-controller-manager-5c6b579757-q6zlh\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.080841 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.096093 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.293676 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"]
Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.399304 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" event={"ID":"e090fd50-eb0c-4f2c-b829-a3e446703cdf","Type":"ContainerStarted","Data":"af0534ca4ded0c44356556de2201e119dd22a80fb5dcc6d1d1c331f0adc7134c"}
Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.572174 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"]
Jan 30 21:19:35 crc kubenswrapper[4914]: W0130 21:19:35.580370 4914 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76382623_8cac_42a6_a9de_436e96a8a153.slice/crio-16c765e7502fd19f52998d5f9440c286b1b824c3aa4d2b41986c9a2824364c9e WatchSource:0}: Error finding container 16c765e7502fd19f52998d5f9440c286b1b824c3aa4d2b41986c9a2824364c9e: Status 404 returned error can't find the container with id 16c765e7502fd19f52998d5f9440c286b1b824c3aa4d2b41986c9a2824364c9e Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.825012 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402ff0c9-9c77-48da-9888-8ef9f46d776b" path="/var/lib/kubelet/pods/402ff0c9-9c77-48da-9888-8ef9f46d776b/volumes" Jan 30 21:19:35 crc kubenswrapper[4914]: I0130 21:19:35.825855 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b80ad06d-82c8-417c-841d-d41f5b1ae6ab" path="/var/lib/kubelet/pods/b80ad06d-82c8-417c-841d-d41f5b1ae6ab/volumes" Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.410949 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" event={"ID":"e090fd50-eb0c-4f2c-b829-a3e446703cdf","Type":"ContainerStarted","Data":"de4fd6bc754c4c551e0d66e96f2c591ac8f20b6b6fc9665f3f7ef82478b1a4e6"} Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.411365 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.413208 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" event={"ID":"76382623-8cac-42a6-a9de-436e96a8a153","Type":"ContainerStarted","Data":"c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf"} Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.413244 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" event={"ID":"76382623-8cac-42a6-a9de-436e96a8a153","Type":"ContainerStarted","Data":"16c765e7502fd19f52998d5f9440c286b1b824c3aa4d2b41986c9a2824364c9e"} Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.413599 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.420623 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.434613 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" podStartSLOduration=3.434591926 podStartE2EDuration="3.434591926s" podCreationTimestamp="2026-01-30 21:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:36.433310821 +0000 UTC m=+309.871947652" watchObservedRunningTime="2026-01-30 21:19:36.434591926 +0000 UTC m=+309.873228697" Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.615220 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" Jan 30 21:19:36 crc kubenswrapper[4914]: I0130 21:19:36.632072 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" podStartSLOduration=3.632052754 podStartE2EDuration="3.632052754s" podCreationTimestamp="2026-01-30 21:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:36.491696245 +0000 UTC m=+309.930333006" 
watchObservedRunningTime="2026-01-30 21:19:36.632052754 +0000 UTC m=+310.070689515" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.184167 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42klg"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.191877 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-42klg" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="registry-server" containerID="cri-o://5cc507b6043f61639986c694184303cff546e5f774a95bc245af04a66d96715c" gracePeriod=30 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.192073 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z85fs"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.195241 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z85fs" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="registry-server" containerID="cri-o://b49d3eb1e53fd4e0b37f8a187a1c4c3771aba2a95b2c9cb26548870333c8e72c" gracePeriod=30 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.198062 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wx2ts"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.198216 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator" containerID="cri-o://db027857765b1fc2bdd884c7974fe92dd1fcbc6a49a614c10365565b650719c5" gracePeriod=30 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.211359 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8nmx"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.212908 4914 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8nmx" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="registry-server" containerID="cri-o://c6267c12116c170fb3f3e060cfc1e5b3851a14d211f134b34989048f6e57a9d2" gracePeriod=30 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.218273 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4bzx9"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.218546 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4bzx9" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="registry-server" containerID="cri-o://3a4076445fd86f783fb252a2b1b6e4df818eb9ad47dec86d6534ca6490d229f9" gracePeriod=30 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.228981 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzn7g"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.229663 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.233258 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzn7g"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.302118 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d539af9-a76b-48db-b786-ab58c8e1d2cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.302394 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f648b\" (UniqueName: \"kubernetes.io/projected/5d539af9-a76b-48db-b786-ab58c8e1d2cf-kube-api-access-f648b\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.302438 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d539af9-a76b-48db-b786-ab58c8e1d2cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.418386 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d539af9-a76b-48db-b786-ab58c8e1d2cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: 
\"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.418476 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d539af9-a76b-48db-b786-ab58c8e1d2cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.418515 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f648b\" (UniqueName: \"kubernetes.io/projected/5d539af9-a76b-48db-b786-ab58c8e1d2cf-kube-api-access-f648b\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.419855 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d539af9-a76b-48db-b786-ab58c8e1d2cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.430771 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d539af9-a76b-48db-b786-ab58c8e1d2cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.440506 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f648b\" (UniqueName: \"kubernetes.io/projected/5d539af9-a76b-48db-b786-ab58c8e1d2cf-kube-api-access-f648b\") pod \"marketplace-operator-79b997595-hzn7g\" (UID: \"5d539af9-a76b-48db-b786-ab58c8e1d2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.492138 4914 generic.go:334] "Generic (PLEG): container finished" podID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerID="b49d3eb1e53fd4e0b37f8a187a1c4c3771aba2a95b2c9cb26548870333c8e72c" exitCode=0 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.492188 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z85fs" event={"ID":"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b","Type":"ContainerDied","Data":"b49d3eb1e53fd4e0b37f8a187a1c4c3771aba2a95b2c9cb26548870333c8e72c"} Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.494010 4914 generic.go:334] "Generic (PLEG): container finished" podID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerID="c6267c12116c170fb3f3e060cfc1e5b3851a14d211f134b34989048f6e57a9d2" exitCode=0 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.494047 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerDied","Data":"c6267c12116c170fb3f3e060cfc1e5b3851a14d211f134b34989048f6e57a9d2"} Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.495482 4914 generic.go:334] "Generic (PLEG): container finished" podID="e8b53784-6398-419a-84b0-65f2550636a5" containerID="3a4076445fd86f783fb252a2b1b6e4df818eb9ad47dec86d6534ca6490d229f9" exitCode=0 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.495516 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bzx9" 
event={"ID":"e8b53784-6398-419a-84b0-65f2550636a5","Type":"ContainerDied","Data":"3a4076445fd86f783fb252a2b1b6e4df818eb9ad47dec86d6534ca6490d229f9"} Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.496610 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerID="db027857765b1fc2bdd884c7974fe92dd1fcbc6a49a614c10365565b650719c5" exitCode=0 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.496645 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" event={"ID":"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc","Type":"ContainerDied","Data":"db027857765b1fc2bdd884c7974fe92dd1fcbc6a49a614c10365565b650719c5"} Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.499548 4914 generic.go:334] "Generic (PLEG): container finished" podID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerID="5cc507b6043f61639986c694184303cff546e5f774a95bc245af04a66d96715c" exitCode=0 Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.499593 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42klg" event={"ID":"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf","Type":"ContainerDied","Data":"5cc507b6043f61639986c694184303cff546e5f774a95bc245af04a66d96715c"} Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.617672 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2cd62"] Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.645161 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.651454 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8nmx" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.727675 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lngzb\" (UniqueName: \"kubernetes.io/projected/cfdb54ed-594a-4867-b500-68bdd392ce12-kube-api-access-lngzb\") pod \"cfdb54ed-594a-4867-b500-68bdd392ce12\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.727751 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-catalog-content\") pod \"cfdb54ed-594a-4867-b500-68bdd392ce12\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.727770 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-utilities\") pod \"cfdb54ed-594a-4867-b500-68bdd392ce12\" (UID: \"cfdb54ed-594a-4867-b500-68bdd392ce12\") " Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.731412 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-utilities" (OuterVolumeSpecName: "utilities") pod "cfdb54ed-594a-4867-b500-68bdd392ce12" (UID: "cfdb54ed-594a-4867-b500-68bdd392ce12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.732013 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfdb54ed-594a-4867-b500-68bdd392ce12-kube-api-access-lngzb" (OuterVolumeSpecName: "kube-api-access-lngzb") pod "cfdb54ed-594a-4867-b500-68bdd392ce12" (UID: "cfdb54ed-594a-4867-b500-68bdd392ce12"). InnerVolumeSpecName "kube-api-access-lngzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.753203 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfdb54ed-594a-4867-b500-68bdd392ce12" (UID: "cfdb54ed-594a-4867-b500-68bdd392ce12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.800144 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z85fs" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.828592 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w5t2\" (UniqueName: \"kubernetes.io/projected/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-kube-api-access-5w5t2\") pod \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.828651 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-catalog-content\") pod \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.828725 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-utilities\") pod \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\" (UID: \"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b\") " Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.828974 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lngzb\" (UniqueName: 
\"kubernetes.io/projected/cfdb54ed-594a-4867-b500-68bdd392ce12-kube-api-access-lngzb\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.828990 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.829002 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfdb54ed-594a-4867-b500-68bdd392ce12-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.829698 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-utilities" (OuterVolumeSpecName: "utilities") pod "6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" (UID: "6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.835637 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-kube-api-access-5w5t2" (OuterVolumeSpecName: "kube-api-access-5w5t2") pod "6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" (UID: "6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b"). InnerVolumeSpecName "kube-api-access-5w5t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.931758 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w5t2\" (UniqueName: \"kubernetes.io/projected/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-kube-api-access-5w5t2\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.931792 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.949047 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" (UID: "6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4914]: I0130 21:19:48.987364 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.032220 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-operator-metrics\") pod \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.032331 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-trusted-ca\") pod \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.032400 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp2b9\" (UniqueName: \"kubernetes.io/projected/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-kube-api-access-xp2b9\") pod \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\" (UID: \"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc\") " Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.032647 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.035864 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-kube-api-access-xp2b9" (OuterVolumeSpecName: "kube-api-access-xp2b9") pod "6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" (UID: "6a2f6adb-e5cc-43f7-974d-11bae45ddbcc"). InnerVolumeSpecName "kube-api-access-xp2b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.036063 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" (UID: "6a2f6adb-e5cc-43f7-974d-11bae45ddbcc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.040244 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" (UID: "6a2f6adb-e5cc-43f7-974d-11bae45ddbcc"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.065626 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4bzx9" Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.081670 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42klg"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.133353 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkdr8\" (UniqueName: \"kubernetes.io/projected/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-kube-api-access-xkdr8\") pod \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") "
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.133782 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bspw\" (UniqueName: \"kubernetes.io/projected/e8b53784-6398-419a-84b0-65f2550636a5-kube-api-access-8bspw\") pod \"e8b53784-6398-419a-84b0-65f2550636a5\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") "
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.133855 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-catalog-content\") pod \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") "
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.133902 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-utilities\") pod \"e8b53784-6398-419a-84b0-65f2550636a5\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") "
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.133926 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-catalog-content\") pod \"e8b53784-6398-419a-84b0-65f2550636a5\" (UID: \"e8b53784-6398-419a-84b0-65f2550636a5\") "
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.133959 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-utilities\") pod \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\" (UID: \"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf\") "
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.134193 4914 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.134217 4914 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.134230 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp2b9\" (UniqueName: \"kubernetes.io/projected/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc-kube-api-access-xp2b9\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.134773 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-utilities" (OuterVolumeSpecName: "utilities") pod "e8b53784-6398-419a-84b0-65f2550636a5" (UID: "e8b53784-6398-419a-84b0-65f2550636a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.134992 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-utilities" (OuterVolumeSpecName: "utilities") pod "d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" (UID: "d7bb25c2-cc0d-43a1-84ba-9b60c8298acf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.136677 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b53784-6398-419a-84b0-65f2550636a5-kube-api-access-8bspw" (OuterVolumeSpecName: "kube-api-access-8bspw") pod "e8b53784-6398-419a-84b0-65f2550636a5" (UID: "e8b53784-6398-419a-84b0-65f2550636a5"). InnerVolumeSpecName "kube-api-access-8bspw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.142256 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-kube-api-access-xkdr8" (OuterVolumeSpecName: "kube-api-access-xkdr8") pod "d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" (UID: "d7bb25c2-cc0d-43a1-84ba-9b60c8298acf"). InnerVolumeSpecName "kube-api-access-xkdr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.179022 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" (UID: "d7bb25c2-cc0d-43a1-84ba-9b60c8298acf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.196356 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzn7g"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.235747 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkdr8\" (UniqueName: \"kubernetes.io/projected/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-kube-api-access-xkdr8\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.235777 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bspw\" (UniqueName: \"kubernetes.io/projected/e8b53784-6398-419a-84b0-65f2550636a5-kube-api-access-8bspw\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.235787 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.235796 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.235805 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.294698 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8b53784-6398-419a-84b0-65f2550636a5" (UID: "e8b53784-6398-419a-84b0-65f2550636a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.336845 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8b53784-6398-419a-84b0-65f2550636a5-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.506611 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8nmx" event={"ID":"cfdb54ed-594a-4867-b500-68bdd392ce12","Type":"ContainerDied","Data":"f8e2be05e690a5929edb936fcae3378d41ea84001f475930197010ef89a2edc1"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.506670 4914 scope.go:117] "RemoveContainer" containerID="c6267c12116c170fb3f3e060cfc1e5b3851a14d211f134b34989048f6e57a9d2"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.506826 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8nmx"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.515692 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4bzx9" event={"ID":"e8b53784-6398-419a-84b0-65f2550636a5","Type":"ContainerDied","Data":"06386b98914b712503f312d0b138a05635fc83e6c2a0e800d631af1ce9c26417"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.515853 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4bzx9"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.518811 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.518823 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wx2ts" event={"ID":"6a2f6adb-e5cc-43f7-974d-11bae45ddbcc","Type":"ContainerDied","Data":"846f0b8ae04475f141760d2b2987cebb8aa92656a57a89a57c7eacf1b8dd04b6"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.520592 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42klg" event={"ID":"d7bb25c2-cc0d-43a1-84ba-9b60c8298acf","Type":"ContainerDied","Data":"380af97274196b65d60ae72bd7c08f545cb5caf72178eb55548b3eee65510884"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.520664 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42klg"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.524910 4914 scope.go:117] "RemoveContainer" containerID="cbed8f60ec766e8b254e01c63e988a7a57636a0ee2316fb74de55aaa8e56c01c"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.530827 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z85fs"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.530826 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z85fs" event={"ID":"6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b","Type":"ContainerDied","Data":"b6a78715b11f3aa7fe1d9ae8702fae502b3c24ec7804ce39f4d9d9a1ddad584f"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.533805 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" event={"ID":"5d539af9-a76b-48db-b786-ab58c8e1d2cf","Type":"ContainerStarted","Data":"ad52e7c15b47549e47e75425842f78650144d785fe33d0d0952a8881b450ddef"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.533854 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" event={"ID":"5d539af9-a76b-48db-b786-ab58c8e1d2cf","Type":"ContainerStarted","Data":"b9a3c4b1700fbed4d938951ee1813ba1a294e04eea4d32ad695c495efac46243"}
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.534170 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.535215 4914 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hzn7g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body=
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.535267 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" podUID="5d539af9-a76b-48db-b786-ab58c8e1d2cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.552945 4914 scope.go:117] "RemoveContainer" containerID="a855e2c6e2cb25d8c669645986034d6797b267b35861d80788c2f6f68ffcbaa3"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.561418 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g" podStartSLOduration=1.561398523 podStartE2EDuration="1.561398523s" podCreationTimestamp="2026-01-30 21:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:49.55914006 +0000 UTC m=+322.997776811" watchObservedRunningTime="2026-01-30 21:19:49.561398523 +0000 UTC m=+323.000035284"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.674278 4914 scope.go:117] "RemoveContainer" containerID="3a4076445fd86f783fb252a2b1b6e4df818eb9ad47dec86d6534ca6490d229f9"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.686367 4914 scope.go:117] "RemoveContainer" containerID="48be772e06de6dd15eb25e0b49fd16a1dd34eeaf3eb52ac4770a240031136397"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.688056 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8nmx"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.692585 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8nmx"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.704954 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4bzx9"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.708027 4914 scope.go:117] "RemoveContainer" containerID="a2fe354cf389270fef530a8d62899781b625c890a229f71b72f0b4bf6c4fe824"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.709407 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4bzx9"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.717039 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42klg"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.722028 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-42klg"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.723059 4914 scope.go:117] "RemoveContainer" containerID="db027857765b1fc2bdd884c7974fe92dd1fcbc6a49a614c10365565b650719c5"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.739414 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wx2ts"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.743150 4914 scope.go:117] "RemoveContainer" containerID="5cc507b6043f61639986c694184303cff546e5f774a95bc245af04a66d96715c"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.744274 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wx2ts"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.747002 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z85fs"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.750632 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z85fs"]
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.759678 4914 scope.go:117] "RemoveContainer" containerID="8095bc63239cb55d603b7b78cc833db4a62e3902b4bb8e7c7a9fe7b1e097fbd2"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.787063 4914 scope.go:117] "RemoveContainer" containerID="c88984954f157771adfc22785cda8de9fee7379cf6cf5bde8555bc32fa03831b"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.802731 4914 scope.go:117] "RemoveContainer" containerID="b49d3eb1e53fd4e0b37f8a187a1c4c3771aba2a95b2c9cb26548870333c8e72c"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.814281 4914 scope.go:117] "RemoveContainer" containerID="166e18538ff97c4ffa88890caded12503c36477cf5e97e23b97844e85d8a353d"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.824010 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" path="/var/lib/kubelet/pods/6a2f6adb-e5cc-43f7-974d-11bae45ddbcc/volumes"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.824867 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" path="/var/lib/kubelet/pods/6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b/volumes"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.825402 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" path="/var/lib/kubelet/pods/cfdb54ed-594a-4867-b500-68bdd392ce12/volumes"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.826417 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" path="/var/lib/kubelet/pods/d7bb25c2-cc0d-43a1-84ba-9b60c8298acf/volumes"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.827024 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b53784-6398-419a-84b0-65f2550636a5" path="/var/lib/kubelet/pods/e8b53784-6398-419a-84b0-65f2550636a5/volumes"
Jan 30 21:19:49 crc kubenswrapper[4914]: I0130 21:19:49.836870 4914 scope.go:117] "RemoveContainer" containerID="f5ee530474bc75f60291b433415f94978996daa68c1d8a593f40cb3cb82a8824"
Jan 30 21:19:50 crc kubenswrapper[4914]: I0130 21:19:50.549189 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hzn7g"
Jan 30 21:19:53 crc kubenswrapper[4914]: I0130 21:19:53.381611 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"]
Jan 30 21:19:53 crc kubenswrapper[4914]: I0130 21:19:53.382192 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" podUID="e090fd50-eb0c-4f2c-b829-a3e446703cdf" containerName="controller-manager" containerID="cri-o://de4fd6bc754c4c551e0d66e96f2c591ac8f20b6b6fc9665f3f7ef82478b1a4e6" gracePeriod=30
Jan 30 21:19:53 crc kubenswrapper[4914]: I0130 21:19:53.481668 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"]
Jan 30 21:19:53 crc kubenswrapper[4914]: I0130 21:19:53.482768 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" podUID="76382623-8cac-42a6-a9de-436e96a8a153" containerName="route-controller-manager" containerID="cri-o://c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf" gracePeriod=30
Jan 30 21:19:53 crc kubenswrapper[4914]: I0130 21:19:53.560060 4914 generic.go:334] "Generic (PLEG): container finished" podID="e090fd50-eb0c-4f2c-b829-a3e446703cdf" containerID="de4fd6bc754c4c551e0d66e96f2c591ac8f20b6b6fc9665f3f7ef82478b1a4e6" exitCode=0
Jan 30 21:19:53 crc kubenswrapper[4914]: I0130 21:19:53.560103 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" event={"ID":"e090fd50-eb0c-4f2c-b829-a3e446703cdf","Type":"ContainerDied","Data":"de4fd6bc754c4c551e0d66e96f2c591ac8f20b6b6fc9665f3f7ef82478b1a4e6"}
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.168065 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.194968 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e090fd50-eb0c-4f2c-b829-a3e446703cdf-serving-cert\") pod \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.195024 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-proxy-ca-bundles\") pod \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.195075 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-client-ca\") pod \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.195140 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr58t\" (UniqueName: \"kubernetes.io/projected/e090fd50-eb0c-4f2c-b829-a3e446703cdf-kube-api-access-xr58t\") pod \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.195194 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-config\") pod \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\" (UID: \"e090fd50-eb0c-4f2c-b829-a3e446703cdf\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.195764 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-client-ca" (OuterVolumeSpecName: "client-ca") pod "e090fd50-eb0c-4f2c-b829-a3e446703cdf" (UID: "e090fd50-eb0c-4f2c-b829-a3e446703cdf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.195961 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e090fd50-eb0c-4f2c-b829-a3e446703cdf" (UID: "e090fd50-eb0c-4f2c-b829-a3e446703cdf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.196781 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-config" (OuterVolumeSpecName: "config") pod "e090fd50-eb0c-4f2c-b829-a3e446703cdf" (UID: "e090fd50-eb0c-4f2c-b829-a3e446703cdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.200996 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e090fd50-eb0c-4f2c-b829-a3e446703cdf-kube-api-access-xr58t" (OuterVolumeSpecName: "kube-api-access-xr58t") pod "e090fd50-eb0c-4f2c-b829-a3e446703cdf" (UID: "e090fd50-eb0c-4f2c-b829-a3e446703cdf"). InnerVolumeSpecName "kube-api-access-xr58t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.203142 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e090fd50-eb0c-4f2c-b829-a3e446703cdf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e090fd50-eb0c-4f2c-b829-a3e446703cdf" (UID: "e090fd50-eb0c-4f2c-b829-a3e446703cdf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.296612 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.296653 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e090fd50-eb0c-4f2c-b829-a3e446703cdf-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.296665 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.296676 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e090fd50-eb0c-4f2c-b829-a3e446703cdf-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.296685 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr58t\" (UniqueName: \"kubernetes.io/projected/e090fd50-eb0c-4f2c-b829-a3e446703cdf-kube-api-access-xr58t\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.344828 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.397766 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dnl4\" (UniqueName: \"kubernetes.io/projected/76382623-8cac-42a6-a9de-436e96a8a153-kube-api-access-8dnl4\") pod \"76382623-8cac-42a6-a9de-436e96a8a153\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.397813 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-client-ca\") pod \"76382623-8cac-42a6-a9de-436e96a8a153\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.397837 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-config\") pod \"76382623-8cac-42a6-a9de-436e96a8a153\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.397894 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76382623-8cac-42a6-a9de-436e96a8a153-serving-cert\") pod \"76382623-8cac-42a6-a9de-436e96a8a153\" (UID: \"76382623-8cac-42a6-a9de-436e96a8a153\") "
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.398587 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-config" (OuterVolumeSpecName: "config") pod "76382623-8cac-42a6-a9de-436e96a8a153" (UID: "76382623-8cac-42a6-a9de-436e96a8a153"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.398584 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-client-ca" (OuterVolumeSpecName: "client-ca") pod "76382623-8cac-42a6-a9de-436e96a8a153" (UID: "76382623-8cac-42a6-a9de-436e96a8a153"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.401722 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76382623-8cac-42a6-a9de-436e96a8a153-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76382623-8cac-42a6-a9de-436e96a8a153" (UID: "76382623-8cac-42a6-a9de-436e96a8a153"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.401762 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76382623-8cac-42a6-a9de-436e96a8a153-kube-api-access-8dnl4" (OuterVolumeSpecName: "kube-api-access-8dnl4") pod "76382623-8cac-42a6-a9de-436e96a8a153" (UID: "76382623-8cac-42a6-a9de-436e96a8a153"). InnerVolumeSpecName "kube-api-access-8dnl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.498809 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76382623-8cac-42a6-a9de-436e96a8a153-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.498841 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dnl4\" (UniqueName: \"kubernetes.io/projected/76382623-8cac-42a6-a9de-436e96a8a153-kube-api-access-8dnl4\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.498852 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.498861 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76382623-8cac-42a6-a9de-436e96a8a153-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.567716 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr" event={"ID":"e090fd50-eb0c-4f2c-b829-a3e446703cdf","Type":"ContainerDied","Data":"af0534ca4ded0c44356556de2201e119dd22a80fb5dcc6d1d1c331f0adc7134c"}
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.567767 4914 scope.go:117] "RemoveContainer" containerID="de4fd6bc754c4c551e0d66e96f2c591ac8f20b6b6fc9665f3f7ef82478b1a4e6"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.567847 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.574136 4914 generic.go:334] "Generic (PLEG): container finished" podID="76382623-8cac-42a6-a9de-436e96a8a153" containerID="c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf" exitCode=0
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.574185 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" event={"ID":"76382623-8cac-42a6-a9de-436e96a8a153","Type":"ContainerDied","Data":"c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf"}
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.574208 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.574217 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh" event={"ID":"76382623-8cac-42a6-a9de-436e96a8a153","Type":"ContainerDied","Data":"16c765e7502fd19f52998d5f9440c286b1b824c3aa4d2b41986c9a2824364c9e"}
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.587893 4914 scope.go:117] "RemoveContainer" containerID="c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.591862 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"]
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.594516 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-4zfnr"]
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.604656 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"]
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.607718 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-q6zlh"]
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.608968 4914 scope.go:117] "RemoveContainer" containerID="c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.609318 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf\": container with ID starting with c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf not found: ID does not exist" containerID="c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.609358 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf"} err="failed to get container status \"c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf\": rpc error: code = NotFound desc = could not find container \"c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf\": container with ID starting with c31672a24d5a479b1797fba08654d3bfc374353aecf6b7ea5f8a9bf4eaf5f8bf not found: ID does not exist"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.744991 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75b4f6d956-msm7w"]
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745208 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745220 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745230 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745236 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745247 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745254 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745261 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745270 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745282 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e090fd50-eb0c-4f2c-b829-a3e446703cdf" containerName="controller-manager"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745288 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e090fd50-eb0c-4f2c-b829-a3e446703cdf" containerName="controller-manager"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745296 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745302 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745311 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745317 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745327 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745332 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745343 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745349 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="registry-server"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745356 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745361 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745370 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745375 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745382 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76382623-8cac-42a6-a9de-436e96a8a153" containerName="route-controller-manager"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745387 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="76382623-8cac-42a6-a9de-436e96a8a153" containerName="route-controller-manager"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745396 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745402 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745410 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745415 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="extract-utilities"
Jan 30 21:19:54 crc kubenswrapper[4914]: E0130 21:19:54.745423 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745428 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="extract-content"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745510 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2f6adb-e5cc-43f7-974d-11bae45ddbcc" containerName="marketplace-operator"
Jan 30 21:19:54 crc kubenswrapper[4914]: I0130
21:19:54.745525 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="76382623-8cac-42a6-a9de-436e96a8a153" containerName="route-controller-manager" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745534 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb25c2-cc0d-43a1-84ba-9b60c8298acf" containerName="registry-server" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745543 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b53784-6398-419a-84b0-65f2550636a5" containerName="registry-server" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745555 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9ae93a-9017-4fbf-aac3-1a1bb8081f6b" containerName="registry-server" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745566 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfdb54ed-594a-4867-b500-68bdd392ce12" containerName="registry-server" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745576 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="e090fd50-eb0c-4f2c-b829-a3e446703cdf" containerName="controller-manager" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.745978 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.747615 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"] Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.748297 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.749943 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.754608 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.754921 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.755033 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.755132 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b4f6d956-msm7w"] Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.756124 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757155 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"] Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757175 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757213 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757350 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 
21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757633 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757765 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757805 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757889 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.757894 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802508 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-client-ca\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802557 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857cb\" (UniqueName: \"kubernetes.io/projected/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-kube-api-access-857cb\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802592 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-client-ca\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802666 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-config\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802748 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-serving-cert\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802776 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-serving-cert\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802804 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-proxy-ca-bundles\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " 
pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802886 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-config\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.802998 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8mn\" (UniqueName: \"kubernetes.io/projected/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-kube-api-access-vk8mn\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904036 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-client-ca\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904097 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857cb\" (UniqueName: \"kubernetes.io/projected/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-kube-api-access-857cb\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904155 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-client-ca\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904600 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-config\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904660 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-serving-cert\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904685 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-serving-cert\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904727 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-proxy-ca-bundles\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 
21:19:54.904765 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-config\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.904798 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8mn\" (UniqueName: \"kubernetes.io/projected/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-kube-api-access-vk8mn\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.905455 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-client-ca\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.905535 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-client-ca\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.906412 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-config\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " 
pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.906676 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-proxy-ca-bundles\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.907486 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-config\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.910814 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-serving-cert\") pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.910892 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-serving-cert\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.919516 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8mn\" (UniqueName: \"kubernetes.io/projected/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-kube-api-access-vk8mn\") 
pod \"route-controller-manager-7d54789467-9zzfk\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") " pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:54 crc kubenswrapper[4914]: I0130 21:19:54.924095 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857cb\" (UniqueName: \"kubernetes.io/projected/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-kube-api-access-857cb\") pod \"controller-manager-75b4f6d956-msm7w\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.078219 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.088326 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.474060 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b4f6d956-msm7w"] Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.544118 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"] Jan 30 21:19:55 crc kubenswrapper[4914]: W0130 21:19:55.554406 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3526f71_a6ad_452a_a5b7_ae8fcb2325d1.slice/crio-827f61d4e58499f7ae0adf1ca030f4a1d36b3c7ff47c29d76ffff5a624a9f76c WatchSource:0}: Error finding container 827f61d4e58499f7ae0adf1ca030f4a1d36b3c7ff47c29d76ffff5a624a9f76c: Status 404 returned error can't find the container with id 827f61d4e58499f7ae0adf1ca030f4a1d36b3c7ff47c29d76ffff5a624a9f76c Jan 30 21:19:55 crc 
kubenswrapper[4914]: I0130 21:19:55.580051 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" event={"ID":"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1","Type":"ContainerStarted","Data":"827f61d4e58499f7ae0adf1ca030f4a1d36b3c7ff47c29d76ffff5a624a9f76c"} Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.584352 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" event={"ID":"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c","Type":"ContainerStarted","Data":"383bb07caa29cf52199f862bc3f4225193ca09aa25590760948b28544e5cd136"} Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.824322 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76382623-8cac-42a6-a9de-436e96a8a153" path="/var/lib/kubelet/pods/76382623-8cac-42a6-a9de-436e96a8a153/volumes" Jan 30 21:19:55 crc kubenswrapper[4914]: I0130 21:19:55.825525 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e090fd50-eb0c-4f2c-b829-a3e446703cdf" path="/var/lib/kubelet/pods/e090fd50-eb0c-4f2c-b829-a3e446703cdf/volumes" Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.591744 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" event={"ID":"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c","Type":"ContainerStarted","Data":"f7f60d78bb9b269980cf00627b5070b2e868e07a9d184400a1892194b9ba2e5c"} Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.592017 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.593204 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" 
event={"ID":"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1","Type":"ContainerStarted","Data":"8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d"} Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.593811 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.598968 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.603755 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:19:56 crc kubenswrapper[4914]: I0130 21:19:56.616660 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" podStartSLOduration=3.616640253 podStartE2EDuration="3.616640253s" podCreationTimestamp="2026-01-30 21:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:56.613141943 +0000 UTC m=+330.051778704" watchObservedRunningTime="2026-01-30 21:19:56.616640253 +0000 UTC m=+330.055277014" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.624875 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" podStartSLOduration=5.624848683 podStartE2EDuration="5.624848683s" podCreationTimestamp="2026-01-30 21:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:56.649805783 +0000 UTC m=+330.088442594" watchObservedRunningTime="2026-01-30 21:19:58.624848683 +0000 UTC m=+332.063485454" Jan 
30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.629726 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-485b5"] Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.630898 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.640869 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-485b5"] Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.641360 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.746131 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-catalog-content\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.746252 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v87g\" (UniqueName: \"kubernetes.io/projected/2847fa80-29b0-4b80-b48b-04661f64dbc7-kube-api-access-6v87g\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.746316 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-utilities\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc 
kubenswrapper[4914]: I0130 21:19:58.826928 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ljlfr"] Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.828068 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.830881 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.834648 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljlfr"] Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.847503 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v87g\" (UniqueName: \"kubernetes.io/projected/2847fa80-29b0-4b80-b48b-04661f64dbc7-kube-api-access-6v87g\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.847552 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6tq\" (UniqueName: \"kubernetes.io/projected/abd996f5-b265-4424-8c95-b670890abf5d-kube-api-access-vx6tq\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.847581 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-utilities\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.847597 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-utilities\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.847636 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-catalog-content\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.847657 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-catalog-content\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.848195 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-utilities\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.848407 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-catalog-content\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.867593 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v87g\" (UniqueName: \"kubernetes.io/projected/2847fa80-29b0-4b80-b48b-04661f64dbc7-kube-api-access-6v87g\") pod \"community-operators-485b5\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.949001 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6tq\" (UniqueName: \"kubernetes.io/projected/abd996f5-b265-4424-8c95-b670890abf5d-kube-api-access-vx6tq\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.949057 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-utilities\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.949108 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-catalog-content\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.949813 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-catalog-content\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.949926 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-utilities\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:58 crc kubenswrapper[4914]: I0130 21:19:58.964818 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6tq\" (UniqueName: \"kubernetes.io/projected/abd996f5-b265-4424-8c95-b670890abf5d-kube-api-access-vx6tq\") pod \"certified-operators-ljlfr\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") " pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:59 crc kubenswrapper[4914]: I0130 21:19:59.179682 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:19:59 crc kubenswrapper[4914]: I0130 21:19:59.187045 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:19:59 crc kubenswrapper[4914]: I0130 21:19:59.594749 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-485b5"] Jan 30 21:19:59 crc kubenswrapper[4914]: I0130 21:19:59.614799 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-485b5" event={"ID":"2847fa80-29b0-4b80-b48b-04661f64dbc7","Type":"ContainerStarted","Data":"c7d6ac4cccdbe19d7e1a9a01cba51e387baf1110ea9fab09c749500967c47b5f"} Jan 30 21:19:59 crc kubenswrapper[4914]: I0130 21:19:59.665837 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljlfr"] Jan 30 21:20:00 crc kubenswrapper[4914]: I0130 21:20:00.622520 4914 generic.go:334] "Generic (PLEG): container finished" podID="abd996f5-b265-4424-8c95-b670890abf5d" containerID="6376db8c6b18e9cf7b3d8bbb043e255a6c7235b5370cf1b846979ea634cb9691" exitCode=0 Jan 30 21:20:00 crc kubenswrapper[4914]: I0130 21:20:00.622568 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljlfr" event={"ID":"abd996f5-b265-4424-8c95-b670890abf5d","Type":"ContainerDied","Data":"6376db8c6b18e9cf7b3d8bbb043e255a6c7235b5370cf1b846979ea634cb9691"} Jan 30 21:20:00 crc kubenswrapper[4914]: I0130 21:20:00.622961 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljlfr" event={"ID":"abd996f5-b265-4424-8c95-b670890abf5d","Type":"ContainerStarted","Data":"d3ea73800d1edc80b30b92449f8ce2c196a8644837c2e9cc17edbd277d570952"} Jan 30 21:20:00 crc kubenswrapper[4914]: I0130 21:20:00.624416 4914 generic.go:334] "Generic (PLEG): container finished" podID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerID="2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be" exitCode=0 Jan 30 21:20:00 crc kubenswrapper[4914]: I0130 21:20:00.624442 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-485b5" event={"ID":"2847fa80-29b0-4b80-b48b-04661f64dbc7","Type":"ContainerDied","Data":"2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be"} Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.031593 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2zb4n"] Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.034587 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.037741 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.039441 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zb4n"] Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.183878 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870932da-afd7-4695-ab66-3726c700fea4-catalog-content\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.184005 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsvm\" (UniqueName: \"kubernetes.io/projected/870932da-afd7-4695-ab66-3726c700fea4-kube-api-access-lbsvm\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.184455 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/870932da-afd7-4695-ab66-3726c700fea4-utilities\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.227614 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mspfz"] Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.228886 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.231579 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.241171 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mspfz"] Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.285834 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870932da-afd7-4695-ab66-3726c700fea4-utilities\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.285882 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870932da-afd7-4695-ab66-3726c700fea4-catalog-content\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.286134 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsvm\" (UniqueName: \"kubernetes.io/projected/870932da-afd7-4695-ab66-3726c700fea4-kube-api-access-lbsvm\") pod 
\"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.286325 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870932da-afd7-4695-ab66-3726c700fea4-catalog-content\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.286367 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870932da-afd7-4695-ab66-3726c700fea4-utilities\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.307783 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsvm\" (UniqueName: \"kubernetes.io/projected/870932da-afd7-4695-ab66-3726c700fea4-kube-api-access-lbsvm\") pod \"redhat-operators-2zb4n\" (UID: \"870932da-afd7-4695-ab66-3726c700fea4\") " pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.372572 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.387090 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f146a773-aaa7-4818-bb38-6547863767d5-catalog-content\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.387142 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2sdm\" (UniqueName: \"kubernetes.io/projected/f146a773-aaa7-4818-bb38-6547863767d5-kube-api-access-m2sdm\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.387183 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f146a773-aaa7-4818-bb38-6547863767d5-utilities\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.487835 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f146a773-aaa7-4818-bb38-6547863767d5-utilities\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.487919 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f146a773-aaa7-4818-bb38-6547863767d5-catalog-content\") pod \"redhat-marketplace-mspfz\" (UID: 
\"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.487947 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2sdm\" (UniqueName: \"kubernetes.io/projected/f146a773-aaa7-4818-bb38-6547863767d5-kube-api-access-m2sdm\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.488368 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f146a773-aaa7-4818-bb38-6547863767d5-utilities\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.488457 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f146a773-aaa7-4818-bb38-6547863767d5-catalog-content\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.511324 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2sdm\" (UniqueName: \"kubernetes.io/projected/f146a773-aaa7-4818-bb38-6547863767d5-kube-api-access-m2sdm\") pod \"redhat-marketplace-mspfz\" (UID: \"f146a773-aaa7-4818-bb38-6547863767d5\") " pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.545864 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.840070 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zb4n"] Jan 30 21:20:01 crc kubenswrapper[4914]: W0130 21:20:01.846740 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870932da_afd7_4695_ab66_3726c700fea4.slice/crio-9d23d7f1902f3af1c85621c2b3f7fcf11121d9990af27e343728e58534328110 WatchSource:0}: Error finding container 9d23d7f1902f3af1c85621c2b3f7fcf11121d9990af27e343728e58534328110: Status 404 returned error can't find the container with id 9d23d7f1902f3af1c85621c2b3f7fcf11121d9990af27e343728e58534328110 Jan 30 21:20:01 crc kubenswrapper[4914]: I0130 21:20:01.934865 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mspfz"] Jan 30 21:20:01 crc kubenswrapper[4914]: W0130 21:20:01.972101 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf146a773_aaa7_4818_bb38_6547863767d5.slice/crio-d4a3688e0dd370f2a0884d41912a5c947c68f2f9d28c60ae74c94ec16f345a5e WatchSource:0}: Error finding container d4a3688e0dd370f2a0884d41912a5c947c68f2f9d28c60ae74c94ec16f345a5e: Status 404 returned error can't find the container with id d4a3688e0dd370f2a0884d41912a5c947c68f2f9d28c60ae74c94ec16f345a5e Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.653641 4914 generic.go:334] "Generic (PLEG): container finished" podID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerID="57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e" exitCode=0 Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.654028 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-485b5" 
event={"ID":"2847fa80-29b0-4b80-b48b-04661f64dbc7","Type":"ContainerDied","Data":"57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e"} Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.656933 4914 generic.go:334] "Generic (PLEG): container finished" podID="abd996f5-b265-4424-8c95-b670890abf5d" containerID="07716fdb83526288e3c9b7ec9bf39a021663cde022f38e2b6faa0a2277d28c76" exitCode=0 Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.657024 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljlfr" event={"ID":"abd996f5-b265-4424-8c95-b670890abf5d","Type":"ContainerDied","Data":"07716fdb83526288e3c9b7ec9bf39a021663cde022f38e2b6faa0a2277d28c76"} Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.658624 4914 generic.go:334] "Generic (PLEG): container finished" podID="f146a773-aaa7-4818-bb38-6547863767d5" containerID="5d2b586e69a89e8b9514153e7950e923eaa3550b8c88f2764d672f9309759c58" exitCode=0 Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.658738 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mspfz" event={"ID":"f146a773-aaa7-4818-bb38-6547863767d5","Type":"ContainerDied","Data":"5d2b586e69a89e8b9514153e7950e923eaa3550b8c88f2764d672f9309759c58"} Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.658773 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mspfz" event={"ID":"f146a773-aaa7-4818-bb38-6547863767d5","Type":"ContainerStarted","Data":"d4a3688e0dd370f2a0884d41912a5c947c68f2f9d28c60ae74c94ec16f345a5e"} Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.662196 4914 generic.go:334] "Generic (PLEG): container finished" podID="870932da-afd7-4695-ab66-3726c700fea4" containerID="1c440a9bcce69e1eeb9840043225eb434532543caa821dd0718f690e2101a97d" exitCode=0 Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.662298 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2zb4n" event={"ID":"870932da-afd7-4695-ab66-3726c700fea4","Type":"ContainerDied","Data":"1c440a9bcce69e1eeb9840043225eb434532543caa821dd0718f690e2101a97d"} Jan 30 21:20:02 crc kubenswrapper[4914]: I0130 21:20:02.662347 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb4n" event={"ID":"870932da-afd7-4695-ab66-3726c700fea4","Type":"ContainerStarted","Data":"9d23d7f1902f3af1c85621c2b3f7fcf11121d9990af27e343728e58534328110"} Jan 30 21:20:04 crc kubenswrapper[4914]: I0130 21:20:04.675180 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljlfr" event={"ID":"abd996f5-b265-4424-8c95-b670890abf5d","Type":"ContainerStarted","Data":"6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982"} Jan 30 21:20:04 crc kubenswrapper[4914]: I0130 21:20:04.698919 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ljlfr" podStartSLOduration=3.745844047 podStartE2EDuration="6.698897567s" podCreationTimestamp="2026-01-30 21:19:58 +0000 UTC" firstStartedPulling="2026-01-30 21:20:00.62400198 +0000 UTC m=+334.062638741" lastFinishedPulling="2026-01-30 21:20:03.5770555 +0000 UTC m=+337.015692261" observedRunningTime="2026-01-30 21:20:04.693661391 +0000 UTC m=+338.132298172" watchObservedRunningTime="2026-01-30 21:20:04.698897567 +0000 UTC m=+338.137534338" Jan 30 21:20:05 crc kubenswrapper[4914]: I0130 21:20:05.683444 4914 generic.go:334] "Generic (PLEG): container finished" podID="f146a773-aaa7-4818-bb38-6547863767d5" containerID="8d250de8e35be4d130e1d0f0cc2161c5b73c70b0f18597e7ebdda91189da4cf2" exitCode=0 Jan 30 21:20:05 crc kubenswrapper[4914]: I0130 21:20:05.683525 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mspfz" 
event={"ID":"f146a773-aaa7-4818-bb38-6547863767d5","Type":"ContainerDied","Data":"8d250de8e35be4d130e1d0f0cc2161c5b73c70b0f18597e7ebdda91189da4cf2"} Jan 30 21:20:05 crc kubenswrapper[4914]: I0130 21:20:05.687730 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb4n" event={"ID":"870932da-afd7-4695-ab66-3726c700fea4","Type":"ContainerStarted","Data":"1742624d71e551cca88eb11e3a1ae141dfffe5659545e57533621f2980ee642c"} Jan 30 21:20:05 crc kubenswrapper[4914]: I0130 21:20:05.690233 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-485b5" event={"ID":"2847fa80-29b0-4b80-b48b-04661f64dbc7","Type":"ContainerStarted","Data":"287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef"} Jan 30 21:20:05 crc kubenswrapper[4914]: I0130 21:20:05.719148 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-485b5" podStartSLOduration=3.812253527 podStartE2EDuration="7.71913234s" podCreationTimestamp="2026-01-30 21:19:58 +0000 UTC" firstStartedPulling="2026-01-30 21:20:00.635050756 +0000 UTC m=+334.073687517" lastFinishedPulling="2026-01-30 21:20:04.541929529 +0000 UTC m=+337.980566330" observedRunningTime="2026-01-30 21:20:05.717253991 +0000 UTC m=+339.155890772" watchObservedRunningTime="2026-01-30 21:20:05.71913234 +0000 UTC m=+339.157769101" Jan 30 21:20:06 crc kubenswrapper[4914]: I0130 21:20:06.718869 4914 generic.go:334] "Generic (PLEG): container finished" podID="870932da-afd7-4695-ab66-3726c700fea4" containerID="1742624d71e551cca88eb11e3a1ae141dfffe5659545e57533621f2980ee642c" exitCode=0 Jan 30 21:20:06 crc kubenswrapper[4914]: I0130 21:20:06.718938 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb4n" event={"ID":"870932da-afd7-4695-ab66-3726c700fea4","Type":"ContainerDied","Data":"1742624d71e551cca88eb11e3a1ae141dfffe5659545e57533621f2980ee642c"} 
Jan 30 21:20:07 crc kubenswrapper[4914]: I0130 21:20:07.753230 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mspfz" event={"ID":"f146a773-aaa7-4818-bb38-6547863767d5","Type":"ContainerStarted","Data":"cfdb2da91d354580475e61245e6c0491c1c14306e4d190e6ec9a1bd112ba9693"} Jan 30 21:20:07 crc kubenswrapper[4914]: I0130 21:20:07.779615 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mspfz" podStartSLOduration=2.65310647 podStartE2EDuration="6.779558854s" podCreationTimestamp="2026-01-30 21:20:01 +0000 UTC" firstStartedPulling="2026-01-30 21:20:02.660838062 +0000 UTC m=+336.099474843" lastFinishedPulling="2026-01-30 21:20:06.787290426 +0000 UTC m=+340.225927227" observedRunningTime="2026-01-30 21:20:07.777509251 +0000 UTC m=+341.216146022" watchObservedRunningTime="2026-01-30 21:20:07.779558854 +0000 UTC m=+341.218195615" Jan 30 21:20:08 crc kubenswrapper[4914]: I0130 21:20:08.761541 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb4n" event={"ID":"870932da-afd7-4695-ab66-3726c700fea4","Type":"ContainerStarted","Data":"360280e284da71939b96095f604279922164efe1427e9fd8c1d056f27cd54d3d"} Jan 30 21:20:08 crc kubenswrapper[4914]: I0130 21:20:08.787483 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2zb4n" podStartSLOduration=2.158757527 podStartE2EDuration="7.787466168s" podCreationTimestamp="2026-01-30 21:20:01 +0000 UTC" firstStartedPulling="2026-01-30 21:20:02.665532804 +0000 UTC m=+336.104169585" lastFinishedPulling="2026-01-30 21:20:08.294241465 +0000 UTC m=+341.732878226" observedRunningTime="2026-01-30 21:20:08.784572192 +0000 UTC m=+342.223208963" watchObservedRunningTime="2026-01-30 21:20:08.787466168 +0000 UTC m=+342.226102929" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.180188 4914 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.180235 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.187174 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.187386 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.240952 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.245502 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:20:09 crc kubenswrapper[4914]: I0130 21:20:09.834389 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ljlfr" Jan 30 21:20:11 crc kubenswrapper[4914]: I0130 21:20:11.372864 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:11 crc kubenswrapper[4914]: I0130 21:20:11.372924 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:11 crc kubenswrapper[4914]: I0130 21:20:11.546562 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:11 crc kubenswrapper[4914]: I0130 21:20:11.546671 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 
21:20:11 crc kubenswrapper[4914]: I0130 21:20:11.595841 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:12 crc kubenswrapper[4914]: I0130 21:20:12.434697 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2zb4n" podUID="870932da-afd7-4695-ab66-3726c700fea4" containerName="registry-server" probeResult="failure" output=< Jan 30 21:20:12 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:20:12 crc kubenswrapper[4914]: > Jan 30 21:20:13 crc kubenswrapper[4914]: I0130 21:20:13.434879 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75b4f6d956-msm7w"] Jan 30 21:20:13 crc kubenswrapper[4914]: I0130 21:20:13.435130 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" podUID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" containerName="controller-manager" containerID="cri-o://f7f60d78bb9b269980cf00627b5070b2e868e07a9d184400a1892194b9ba2e5c" gracePeriod=30 Jan 30 21:20:13 crc kubenswrapper[4914]: I0130 21:20:13.639090 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerName="oauth-openshift" containerID="cri-o://b013da72ddeeefda98b6c9fd3fa93acb309c15ad3544ac6f13596f65b59a41b6" gracePeriod=15 Jan 30 21:20:14 crc kubenswrapper[4914]: I0130 21:20:14.798905 4914 generic.go:334] "Generic (PLEG): container finished" podID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerID="b013da72ddeeefda98b6c9fd3fa93acb309c15ad3544ac6f13596f65b59a41b6" exitCode=0 Jan 30 21:20:14 crc kubenswrapper[4914]: I0130 21:20:14.798994 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" 
event={"ID":"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66","Type":"ContainerDied","Data":"b013da72ddeeefda98b6c9fd3fa93acb309c15ad3544ac6f13596f65b59a41b6"} Jan 30 21:20:14 crc kubenswrapper[4914]: I0130 21:20:14.800655 4914 generic.go:334] "Generic (PLEG): container finished" podID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" containerID="f7f60d78bb9b269980cf00627b5070b2e868e07a9d184400a1892194b9ba2e5c" exitCode=0 Jan 30 21:20:14 crc kubenswrapper[4914]: I0130 21:20:14.800678 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" event={"ID":"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c","Type":"ContainerDied","Data":"f7f60d78bb9b269980cf00627b5070b2e868e07a9d184400a1892194b9ba2e5c"} Jan 30 21:20:15 crc kubenswrapper[4914]: I0130 21:20:15.079067 4914 patch_prober.go:28] interesting pod/controller-manager-75b4f6d956-msm7w container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 30 21:20:15 crc kubenswrapper[4914]: I0130 21:20:15.079168 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" podUID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.044818 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.085805 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-85f4d4654f-rgx2d"] Jan 30 21:20:16 crc kubenswrapper[4914]: E0130 21:20:16.086171 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerName="oauth-openshift" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.086198 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerName="oauth-openshift" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.086331 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" containerName="oauth-openshift" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.086789 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.103536 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85f4d4654f-rgx2d"] Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.143141 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.182366 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-login\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.182429 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-trusted-ca-bundle\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.182622 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-serving-cert\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.182988 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-session\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183321 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-config\") pod \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " Jan 30 21:20:16 crc 
kubenswrapper[4914]: I0130 21:20:16.183367 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-policies\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183390 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-dir\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183462 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183493 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-provider-selection\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183517 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-router-certs\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183521 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183558 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-serving-cert\") pod \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183583 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-error\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183602 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-ocp-branding-template\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183616 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-service-ca\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183659 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-857cb\" (UniqueName: \"kubernetes.io/projected/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-kube-api-access-857cb\") pod \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 
21:20:16.183683 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-cliconfig\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183730 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-idp-0-file-data\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183753 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-client-ca\") pod \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183771 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb92g\" (UniqueName: \"kubernetes.io/projected/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-kube-api-access-qb92g\") pod \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\" (UID: \"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183857 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.183998 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-router-certs\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184057 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184080 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184106 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 
21:20:16.184152 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-audit-policies\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184182 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-error\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184226 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184256 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-login\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184304 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25mn\" (UniqueName: 
\"kubernetes.io/projected/71d35e8f-b729-4c7e-84a8-5a80299639d4-kube-api-access-z25mn\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184331 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71d35e8f-b729-4c7e-84a8-5a80299639d4-audit-dir\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184373 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184389 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184399 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-session\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184405 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184437 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-service-ca\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184475 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184520 4914 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184535 4914 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184548 4914 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184558 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184569 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.184596 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-config" (OuterVolumeSpecName: "config") pod "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" (UID: "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.185895 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" (UID: "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.191853 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" (UID: "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.199361 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-kube-api-access-qb92g" (OuterVolumeSpecName: "kube-api-access-qb92g") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "kube-api-access-qb92g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.199400 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.199614 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.201020 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.201470 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.201501 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.201679 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.201855 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.205582 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" (UID: "10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.208844 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-kube-api-access-857cb" (OuterVolumeSpecName: "kube-api-access-857cb") pod "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" (UID: "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c"). InnerVolumeSpecName "kube-api-access-857cb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285363 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-proxy-ca-bundles\") pod \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\" (UID: \"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c\") " Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285656 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-router-certs\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285769 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285807 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285846 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285886 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-audit-policies\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285940 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-error\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.285975 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286022 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-login\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " 
pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286065 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25mn\" (UniqueName: \"kubernetes.io/projected/71d35e8f-b729-4c7e-84a8-5a80299639d4-kube-api-access-z25mn\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286105 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71d35e8f-b729-4c7e-84a8-5a80299639d4-audit-dir\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286137 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286173 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-service-ca\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286205 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-session\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286254 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286335 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286356 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286376 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb92g\" (UniqueName: \"kubernetes.io/projected/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-kube-api-access-qb92g\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286396 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286416 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286437 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286456 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286475 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286493 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286511 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286531 4914 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286550 4914 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286571 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-857cb\" (UniqueName: \"kubernetes.io/projected/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-kube-api-access-857cb\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286566 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71d35e8f-b729-4c7e-84a8-5a80299639d4-audit-dir\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.286906 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" (UID: "13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.287747 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-audit-policies\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.287963 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.288496 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-service-ca\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.288808 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.290211 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-error\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.292084 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-login\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.292260 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.292319 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-router-certs\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.292372 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " 
pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.292528 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.293550 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.293853 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/71d35e8f-b729-4c7e-84a8-5a80299639d4-v4-0-config-system-session\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.303226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25mn\" (UniqueName: \"kubernetes.io/projected/71d35e8f-b729-4c7e-84a8-5a80299639d4-kube-api-access-z25mn\") pod \"oauth-openshift-85f4d4654f-rgx2d\" (UID: \"71d35e8f-b729-4c7e-84a8-5a80299639d4\") " pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.387267 4914 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.405002 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.819747 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" event={"ID":"10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66","Type":"ContainerDied","Data":"7fd27812a3be83870e8661b1959d503dafe4f0afbba0071f8c52204de7da8d82"} Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.819807 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2cd62" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.819886 4914 scope.go:117] "RemoveContainer" containerID="b013da72ddeeefda98b6c9fd3fa93acb309c15ad3544ac6f13596f65b59a41b6" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.822511 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" event={"ID":"13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c","Type":"ContainerDied","Data":"383bb07caa29cf52199f862bc3f4225193ca09aa25590760948b28544e5cd136"} Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.822601 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b4f6d956-msm7w" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.852622 4914 scope.go:117] "RemoveContainer" containerID="f7f60d78bb9b269980cf00627b5070b2e868e07a9d184400a1892194b9ba2e5c" Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.871392 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75b4f6d956-msm7w"] Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.873193 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75b4f6d956-msm7w"] Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.883233 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2cd62"] Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.889880 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2cd62"] Jan 30 21:20:16 crc kubenswrapper[4914]: I0130 21:20:16.923762 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85f4d4654f-rgx2d"] Jan 30 21:20:17 crc kubenswrapper[4914]: I0130 21:20:17.851795 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66" path="/var/lib/kubelet/pods/10cfd72c-1e85-4bcf-97ba-d3c5a84dfb66/volumes" Jan 30 21:20:17 crc kubenswrapper[4914]: I0130 21:20:17.853238 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" path="/var/lib/kubelet/pods/13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c/volumes" Jan 30 21:20:17 crc kubenswrapper[4914]: I0130 21:20:17.854099 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" 
event={"ID":"71d35e8f-b729-4c7e-84a8-5a80299639d4","Type":"ContainerStarted","Data":"43a55a6ce9477f82c03b352a8f2dcd68e320b15cc9818db34070902667d1cb83"} Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.766399 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7"] Jan 30 21:20:18 crc kubenswrapper[4914]: E0130 21:20:18.766972 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" containerName="controller-manager" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.766996 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" containerName="controller-manager" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.767200 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="13087bd1-8b3e-46eb-bfd8-931d1a4d9b4c" containerName="controller-manager" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.767830 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.770523 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.771411 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.775091 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.775344 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.776013 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.776620 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.784033 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.788131 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7"] Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.866944 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" event={"ID":"71d35e8f-b729-4c7e-84a8-5a80299639d4","Type":"ContainerStarted","Data":"edf876c6a83a654584175a70343549ad3b370d0d958a2d7cb51a8125b6f9d6fb"} Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.867785 4914 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.876217 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.900202 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-85f4d4654f-rgx2d" podStartSLOduration=30.900183548 podStartE2EDuration="30.900183548s" podCreationTimestamp="2026-01-30 21:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:18.889561273 +0000 UTC m=+352.328198044" watchObservedRunningTime="2026-01-30 21:20:18.900183548 +0000 UTC m=+352.338820309" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.918841 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-proxy-ca-bundles\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.918893 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2240b84d-6f27-4342-b042-0977e70765d8-serving-cert\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.918918 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-client-ca\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.918945 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4774c\" (UniqueName: \"kubernetes.io/projected/2240b84d-6f27-4342-b042-0977e70765d8-kube-api-access-4774c\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:18 crc kubenswrapper[4914]: I0130 21:20:18.919146 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-config\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.020667 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-config\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.022117 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-config\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.022882 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-proxy-ca-bundles\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.022973 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2240b84d-6f27-4342-b042-0977e70765d8-serving-cert\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.022995 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-client-ca\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.023025 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4774c\" (UniqueName: \"kubernetes.io/projected/2240b84d-6f27-4342-b042-0977e70765d8-kube-api-access-4774c\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.023649 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-proxy-ca-bundles\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " 
pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.024843 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2240b84d-6f27-4342-b042-0977e70765d8-client-ca\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.031666 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2240b84d-6f27-4342-b042-0977e70765d8-serving-cert\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.048589 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4774c\" (UniqueName: \"kubernetes.io/projected/2240b84d-6f27-4342-b042-0977e70765d8-kube-api-access-4774c\") pod \"controller-manager-c5ffdcbcc-r5wp7\" (UID: \"2240b84d-6f27-4342-b042-0977e70765d8\") " pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.092403 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.248292 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.336313 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7"] Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.876050 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" event={"ID":"2240b84d-6f27-4342-b042-0977e70765d8","Type":"ContainerStarted","Data":"c14905d0fd8edf76556ae65b4002ae34a4c2fc15da1169036b6a5a4d5b3fc371"} Jan 30 21:20:19 crc kubenswrapper[4914]: I0130 21:20:19.876336 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" event={"ID":"2240b84d-6f27-4342-b042-0977e70765d8","Type":"ContainerStarted","Data":"fbd707cd7041180029b24b5aa326a1611bfd7852507dc8d1592c3a464172c610"} Jan 30 21:20:20 crc kubenswrapper[4914]: I0130 21:20:20.905500 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" podStartSLOduration=7.905479123 podStartE2EDuration="7.905479123s" podCreationTimestamp="2026-01-30 21:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:20.900172756 +0000 UTC m=+354.338809547" watchObservedRunningTime="2026-01-30 21:20:20.905479123 +0000 UTC m=+354.344115904" Jan 30 21:20:21 crc kubenswrapper[4914]: I0130 21:20:21.423811 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:21 crc kubenswrapper[4914]: I0130 21:20:21.479219 
4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2zb4n" Jan 30 21:20:21 crc kubenswrapper[4914]: I0130 21:20:21.600445 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mspfz" Jan 30 21:20:26 crc kubenswrapper[4914]: I0130 21:20:26.983370 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:20:26 crc kubenswrapper[4914]: I0130 21:20:26.983849 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:20:29 crc kubenswrapper[4914]: I0130 21:20:29.093702 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:29 crc kubenswrapper[4914]: I0130 21:20:29.099504 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" Jan 30 21:20:31 crc kubenswrapper[4914]: I0130 21:20:31.052104 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-ljlfr" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" probeResult="failure" output=< Jan 30 21:20:31 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:20:31 crc kubenswrapper[4914]: > Jan 30 21:20:31 crc kubenswrapper[4914]: I0130 21:20:31.052596 4914 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-marketplace/certified-operators-ljlfr" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" probeResult="failure" output=< Jan 30 21:20:31 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:20:31 crc kubenswrapper[4914]: > Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.278963 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq287"] Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.279791 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.301916 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq287"] Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404440 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-bound-sa-token\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404480 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f926a24f-8d1b-48ca-871e-c78312f80df2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404518 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f926a24f-8d1b-48ca-871e-c78312f80df2-trusted-ca\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404536 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f926a24f-8d1b-48ca-871e-c78312f80df2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404560 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f926a24f-8d1b-48ca-871e-c78312f80df2-registry-certificates\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404606 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404628 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-registry-tls\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" 
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.404654 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kj78\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-kube-api-access-6kj78\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.421642 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505400 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kj78\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-kube-api-access-6kj78\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505479 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-bound-sa-token\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505515 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f926a24f-8d1b-48ca-871e-c78312f80df2-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505543 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f926a24f-8d1b-48ca-871e-c78312f80df2-trusted-ca\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505566 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f926a24f-8d1b-48ca-871e-c78312f80df2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505602 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f926a24f-8d1b-48ca-871e-c78312f80df2-registry-certificates\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.505643 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-registry-tls\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287" Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.506831 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f926a24f-8d1b-48ca-871e-c78312f80df2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.508388 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f926a24f-8d1b-48ca-871e-c78312f80df2-trusted-ca\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.509031 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f926a24f-8d1b-48ca-871e-c78312f80df2-registry-certificates\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.512390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f926a24f-8d1b-48ca-871e-c78312f80df2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.512824 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-registry-tls\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.524237 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kj78\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-kube-api-access-6kj78\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.527273 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f926a24f-8d1b-48ca-871e-c78312f80df2-bound-sa-token\") pod \"image-registry-66df7c8f76-dq287\" (UID: \"f926a24f-8d1b-48ca-871e-c78312f80df2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:36 crc kubenswrapper[4914]: I0130 21:20:36.593004 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:37 crc kubenswrapper[4914]: I0130 21:20:37.057117 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq287"]
Jan 30 21:20:38 crc kubenswrapper[4914]: I0130 21:20:38.070038 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dq287" event={"ID":"f926a24f-8d1b-48ca-871e-c78312f80df2","Type":"ContainerStarted","Data":"a0915b8ed4a81dbd348f26982695efd5717bf0e3e7868bd4373ddcd9b9d31106"}
Jan 30 21:20:38 crc kubenswrapper[4914]: I0130 21:20:38.070101 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dq287" event={"ID":"f926a24f-8d1b-48ca-871e-c78312f80df2","Type":"ContainerStarted","Data":"f3374d9775ac9f0279cb089ccf4838d70d95746f643fdf82c87cde124b5ca6db"}
Jan 30 21:20:38 crc kubenswrapper[4914]: I0130 21:20:38.070251 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:38 crc kubenswrapper[4914]: I0130 21:20:38.095536 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dq287" podStartSLOduration=2.095517881 podStartE2EDuration="2.095517881s" podCreationTimestamp="2026-01-30 21:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:38.093333854 +0000 UTC m=+371.531970665" watchObservedRunningTime="2026-01-30 21:20:38.095517881 +0000 UTC m=+371.534154642"
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.443935 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"]
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.444909 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" podUID="c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" containerName="route-controller-manager" containerID="cri-o://8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d" gracePeriod=30
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.961743 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.981685 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-config\") pod \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") "
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.982066 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-client-ca\") pod \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") "
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.982242 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-serving-cert\") pod \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") "
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.982294 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk8mn\" (UniqueName: \"kubernetes.io/projected/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-kube-api-access-vk8mn\") pod \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\" (UID: \"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1\") "
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.982766 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" (UID: "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.982820 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-config" (OuterVolumeSpecName: "config") pod "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" (UID: "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.989315 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-kube-api-access-vk8mn" (OuterVolumeSpecName: "kube-api-access-vk8mn") pod "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" (UID: "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1"). InnerVolumeSpecName "kube-api-access-vk8mn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:20:53 crc kubenswrapper[4914]: I0130 21:20:53.993263 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" (UID: "c3526f71-a6ad-452a-a5b7-ae8fcb2325d1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.088027 4914 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-client-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.088086 4914 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.088108 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk8mn\" (UniqueName: \"kubernetes.io/projected/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-kube-api-access-vk8mn\") on node \"crc\" DevicePath \"\""
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.088129 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.171055 4914 generic.go:334] "Generic (PLEG): container finished" podID="c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" containerID="8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d" exitCode=0
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.171129 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.171135 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" event={"ID":"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1","Type":"ContainerDied","Data":"8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d"}
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.171203 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk" event={"ID":"c3526f71-a6ad-452a-a5b7-ae8fcb2325d1","Type":"ContainerDied","Data":"827f61d4e58499f7ae0adf1ca030f4a1d36b3c7ff47c29d76ffff5a624a9f76c"}
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.171228 4914 scope.go:117] "RemoveContainer" containerID="8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.196750 4914 scope.go:117] "RemoveContainer" containerID="8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d"
Jan 30 21:20:54 crc kubenswrapper[4914]: E0130 21:20:54.197222 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d\": container with ID starting with 8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d not found: ID does not exist" containerID="8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.197304 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d"} err="failed to get container status \"8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d\": rpc error: code = NotFound desc = could not find container \"8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d\": container with ID starting with 8842cf1c75499103fe6ccbb848c177fb07b3dbaa7e5c1faf5ddbd1dc0d5beb8d not found: ID does not exist"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.211794 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"]
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.219048 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d54789467-9zzfk"]
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.794620 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"]
Jan 30 21:20:54 crc kubenswrapper[4914]: E0130 21:20:54.795952 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" containerName="route-controller-manager"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.795977 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" containerName="route-controller-manager"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.796132 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" containerName="route-controller-manager"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.796875 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.800479 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.800483 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.800671 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.801184 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.801295 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.801455 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.806748 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e95a881-5fc2-443a-86c9-98017eb613a1-client-ca\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.806803 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e95a881-5fc2-443a-86c9-98017eb613a1-config\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.806887 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e95a881-5fc2-443a-86c9-98017eb613a1-serving-cert\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.806960 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdpbw\" (UniqueName: \"kubernetes.io/projected/3e95a881-5fc2-443a-86c9-98017eb613a1-kube-api-access-mdpbw\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.809252 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"]
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.908650 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e95a881-5fc2-443a-86c9-98017eb613a1-serving-cert\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.908790 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdpbw\" (UniqueName: \"kubernetes.io/projected/3e95a881-5fc2-443a-86c9-98017eb613a1-kube-api-access-mdpbw\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.908845 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e95a881-5fc2-443a-86c9-98017eb613a1-client-ca\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.908867 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e95a881-5fc2-443a-86c9-98017eb613a1-config\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.911491 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e95a881-5fc2-443a-86c9-98017eb613a1-client-ca\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.911583 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e95a881-5fc2-443a-86c9-98017eb613a1-config\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.917012 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e95a881-5fc2-443a-86c9-98017eb613a1-serving-cert\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:54 crc kubenswrapper[4914]: I0130 21:20:54.940391 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdpbw\" (UniqueName: \"kubernetes.io/projected/3e95a881-5fc2-443a-86c9-98017eb613a1-kube-api-access-mdpbw\") pod \"route-controller-manager-5c6b579757-c5dvw\" (UID: \"3e95a881-5fc2-443a-86c9-98017eb613a1\") " pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:55 crc kubenswrapper[4914]: I0130 21:20:55.128620 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:55 crc kubenswrapper[4914]: I0130 21:20:55.596648 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"]
Jan 30 21:20:55 crc kubenswrapper[4914]: W0130 21:20:55.601402 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e95a881_5fc2_443a_86c9_98017eb613a1.slice/crio-7b4ea82df6dc130a25727979a96e201fc3d47022f1bef254dd5e2e5da876575a WatchSource:0}: Error finding container 7b4ea82df6dc130a25727979a96e201fc3d47022f1bef254dd5e2e5da876575a: Status 404 returned error can't find the container with id 7b4ea82df6dc130a25727979a96e201fc3d47022f1bef254dd5e2e5da876575a
Jan 30 21:20:55 crc kubenswrapper[4914]: I0130 21:20:55.824878 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3526f71-a6ad-452a-a5b7-ae8fcb2325d1" path="/var/lib/kubelet/pods/c3526f71-a6ad-452a-a5b7-ae8fcb2325d1/volumes"
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.186373 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw" event={"ID":"3e95a881-5fc2-443a-86c9-98017eb613a1","Type":"ContainerStarted","Data":"6502afbb64263438fdd567002eff3811da6dc55768eaa7ddccb5c93d6488f2cb"}
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.186451 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw" event={"ID":"3e95a881-5fc2-443a-86c9-98017eb613a1","Type":"ContainerStarted","Data":"7b4ea82df6dc130a25727979a96e201fc3d47022f1bef254dd5e2e5da876575a"}
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.186773 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.195923 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw"
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.209868 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c6b579757-c5dvw" podStartSLOduration=3.209842176 podStartE2EDuration="3.209842176s" podCreationTimestamp="2026-01-30 21:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:56.209457026 +0000 UTC m=+389.648093817" watchObservedRunningTime="2026-01-30 21:20:56.209842176 +0000 UTC m=+389.648478977"
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.600535 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dq287"
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.668344 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6tfzr"]
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.983869 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:20:56 crc kubenswrapper[4914]: I0130 21:20:56.983953 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:21:21 crc kubenswrapper[4914]: I0130 21:21:21.722464 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" podUID="cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" containerName="registry" containerID="cri-o://47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326" gracePeriod=30
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.238943 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr"
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.342236 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-installation-pull-secrets\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.342364 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-bound-sa-token\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.342480 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-trusted-ca\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.342552 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-ca-trust-extracted\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.342642 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-tls\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.342933 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.343004 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-certificates\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.343069 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr5b6\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-kube-api-access-jr5b6\") pod \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\" (UID: \"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c\") "
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.343463 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.345082 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.349659 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.350088 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.350940 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.358065 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-kube-api-access-jr5b6" (OuterVolumeSpecName: "kube-api-access-jr5b6") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "kube-api-access-jr5b6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.358884 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.365274 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" (UID: "cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.384255 4914 generic.go:334] "Generic (PLEG): container finished" podID="cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" containerID="47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326" exitCode=0
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.384352 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" event={"ID":"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c","Type":"ContainerDied","Data":"47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326"}
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.384444 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr" event={"ID":"cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c","Type":"ContainerDied","Data":"01b066e3de4e7185a85679efccab79cc6cedd94d4a590d317e82fa5339a3f4e6"}
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.384478 4914 scope.go:117] "RemoveContainer" containerID="47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326"
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.384512 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6tfzr"
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.420429 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6tfzr"]
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.422257 4914 scope.go:117] "RemoveContainer" containerID="47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326"
Jan 30 21:21:22 crc kubenswrapper[4914]: E0130 21:21:22.422772 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326\": container with ID starting with 47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326 not found: ID does not exist" containerID="47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326"
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.422811 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326"} err="failed to get container status \"47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326\": rpc error: code = NotFound desc = could not find container \"47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326\": container with ID starting with 47506032b17868a7229ae1db342767e0af49e33c4c4789c98059aa6f81a2d326 not found: ID does not exist"
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.423574 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6tfzr"]
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444428 4914 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444473 4914 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444491 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444505 4914 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444520 4914 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444535 4914 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:22 crc kubenswrapper[4914]: I0130 21:21:22.444548 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr5b6\" (UniqueName: \"kubernetes.io/projected/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c-kube-api-access-jr5b6\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:23 crc kubenswrapper[4914]: I0130 21:21:23.829378 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" path="/var/lib/kubelet/pods/cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c/volumes"
Jan 30 21:21:26 crc kubenswrapper[4914]: I0130 21:21:26.983355 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:21:26 crc kubenswrapper[4914]: I0130 21:21:26.983797 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:21:26 crc kubenswrapper[4914]: I0130 21:21:26.983840 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 21:21:26 crc kubenswrapper[4914]: I0130 21:21:26.984337 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2df647095348fd109e6817a5b9226907389cb72479ca19ac34e62f6c888f7739"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 21:21:26 crc kubenswrapper[4914]: I0130 21:21:26.984381 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://2df647095348fd109e6817a5b9226907389cb72479ca19ac34e62f6c888f7739" gracePeriod=600
Jan 30 21:21:27 crc kubenswrapper[4914]: I0130 21:21:27.416824 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="2df647095348fd109e6817a5b9226907389cb72479ca19ac34e62f6c888f7739" exitCode=0
Jan 30 21:21:27 crc kubenswrapper[4914]: I0130 21:21:27.417240 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"2df647095348fd109e6817a5b9226907389cb72479ca19ac34e62f6c888f7739"}
Jan 30 21:21:27 crc kubenswrapper[4914]: I0130 21:21:27.417289 4914 scope.go:117] "RemoveContainer" containerID="435da81e3258d210f11157ad5d60a9e5edfbde2c9c68db6d72c2f31b11badde4"
Jan 30 21:21:28 crc kubenswrapper[4914]: I0130 21:21:28.426470 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"24c5be9264259bf70fbe610f05edf4820e483959d98c60593634eeec5ed85321"}
Jan 30 21:23:28 crc kubenswrapper[4914]: I0130 21:23:28.163204 4914 scope.go:117] "RemoveContainer" containerID="6c7673879cc11c7b85f34981f7ca4377f62dc33e66b668449724f01aceffef7f"
Jan 30 21:23:28 crc kubenswrapper[4914]: I0130 21:23:28.192297 4914 scope.go:117] "RemoveContainer" containerID="917f63e4bb780d9637d1222a52f82e6541b5d41edd0b80b535d176b614e455b1"
Jan 30 21:23:56 crc kubenswrapper[4914]: I0130 21:23:56.983222 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:23:56 crc kubenswrapper[4914]: I0130 21:23:56.984030 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon"
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:24:26 crc kubenswrapper[4914]: I0130 21:24:26.983314 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:24:26 crc kubenswrapper[4914]: I0130 21:24:26.985002 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.690905 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb"] Jan 30 21:24:45 crc kubenswrapper[4914]: E0130 21:24:45.691665 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" containerName="registry" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.691678 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" containerName="registry" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.691798 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3414f1-ccf6-4e8a-ae56-2de4f2c7be2c" containerName="registry" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.692616 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.696059 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.711258 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb"] Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.806455 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.806551 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.806589 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrjg\" (UniqueName: \"kubernetes.io/projected/7133226d-656c-40ba-9d8b-5c0a011efb4b-kube-api-access-chrjg\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: 
I0130 21:24:45.908342 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.908513 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.908626 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chrjg\" (UniqueName: \"kubernetes.io/projected/7133226d-656c-40ba-9d8b-5c0a011efb4b-kube-api-access-chrjg\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.909333 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.909662 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:45 crc kubenswrapper[4914]: I0130 21:24:45.928582 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrjg\" (UniqueName: \"kubernetes.io/projected/7133226d-656c-40ba-9d8b-5c0a011efb4b-kube-api-access-chrjg\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:46 crc kubenswrapper[4914]: I0130 21:24:46.022758 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:46 crc kubenswrapper[4914]: I0130 21:24:46.253825 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb"] Jan 30 21:24:46 crc kubenswrapper[4914]: I0130 21:24:46.789279 4914 generic.go:334] "Generic (PLEG): container finished" podID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerID="13c969e53567ff3a5984e714e62aaf9b04c2a5bd5e2cb672f08bf485ce1ebcce" exitCode=0 Jan 30 21:24:46 crc kubenswrapper[4914]: I0130 21:24:46.789357 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" event={"ID":"7133226d-656c-40ba-9d8b-5c0a011efb4b","Type":"ContainerDied","Data":"13c969e53567ff3a5984e714e62aaf9b04c2a5bd5e2cb672f08bf485ce1ebcce"} Jan 30 21:24:46 crc kubenswrapper[4914]: I0130 21:24:46.789604 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" event={"ID":"7133226d-656c-40ba-9d8b-5c0a011efb4b","Type":"ContainerStarted","Data":"69335281cded7bcd90d45234d6cb996a2691b42af6e4ba2b5ef88e245cc43ee2"} Jan 30 21:24:46 crc kubenswrapper[4914]: I0130 21:24:46.791914 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:24:48 crc kubenswrapper[4914]: I0130 21:24:48.805018 4914 generic.go:334] "Generic (PLEG): container finished" podID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerID="ee257dda5e0a11c0f8c2a65121e152d67c9a072a4130cace7fa0a13dac77fb19" exitCode=0 Jan 30 21:24:48 crc kubenswrapper[4914]: I0130 21:24:48.805169 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" event={"ID":"7133226d-656c-40ba-9d8b-5c0a011efb4b","Type":"ContainerDied","Data":"ee257dda5e0a11c0f8c2a65121e152d67c9a072a4130cace7fa0a13dac77fb19"} Jan 30 21:24:49 crc kubenswrapper[4914]: I0130 21:24:49.815056 4914 generic.go:334] "Generic (PLEG): container finished" podID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerID="03f5c89e825673579425410a65ca8d14c4db967b19b4e5ccd5523bc73b15226f" exitCode=0 Jan 30 21:24:49 crc kubenswrapper[4914]: I0130 21:24:49.815101 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" event={"ID":"7133226d-656c-40ba-9d8b-5c0a011efb4b","Type":"ContainerDied","Data":"03f5c89e825673579425410a65ca8d14c4db967b19b4e5ccd5523bc73b15226f"} Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.149629 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.281118 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chrjg\" (UniqueName: \"kubernetes.io/projected/7133226d-656c-40ba-9d8b-5c0a011efb4b-kube-api-access-chrjg\") pod \"7133226d-656c-40ba-9d8b-5c0a011efb4b\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.281377 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-util\") pod \"7133226d-656c-40ba-9d8b-5c0a011efb4b\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.281487 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-bundle\") pod \"7133226d-656c-40ba-9d8b-5c0a011efb4b\" (UID: \"7133226d-656c-40ba-9d8b-5c0a011efb4b\") " Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.287336 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-bundle" (OuterVolumeSpecName: "bundle") pod "7133226d-656c-40ba-9d8b-5c0a011efb4b" (UID: "7133226d-656c-40ba-9d8b-5c0a011efb4b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.297091 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7133226d-656c-40ba-9d8b-5c0a011efb4b-kube-api-access-chrjg" (OuterVolumeSpecName: "kube-api-access-chrjg") pod "7133226d-656c-40ba-9d8b-5c0a011efb4b" (UID: "7133226d-656c-40ba-9d8b-5c0a011efb4b"). InnerVolumeSpecName "kube-api-access-chrjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.304743 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-util" (OuterVolumeSpecName: "util") pod "7133226d-656c-40ba-9d8b-5c0a011efb4b" (UID: "7133226d-656c-40ba-9d8b-5c0a011efb4b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.383183 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chrjg\" (UniqueName: \"kubernetes.io/projected/7133226d-656c-40ba-9d8b-5c0a011efb4b-kube-api-access-chrjg\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.383229 4914 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.383248 4914 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7133226d-656c-40ba-9d8b-5c0a011efb4b-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.835010 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" event={"ID":"7133226d-656c-40ba-9d8b-5c0a011efb4b","Type":"ContainerDied","Data":"69335281cded7bcd90d45234d6cb996a2691b42af6e4ba2b5ef88e245cc43ee2"} Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.835093 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69335281cded7bcd90d45234d6cb996a2691b42af6e4ba2b5ef88e245cc43ee2" Jan 30 21:24:51 crc kubenswrapper[4914]: I0130 21:24:51.835293 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb" Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.930138 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hchqc"] Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.930906 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-controller" containerID="cri-o://27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.931040 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="northd" containerID="cri-o://a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.931005 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="nbdb" containerID="cri-o://d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.931119 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-acl-logging" containerID="cri-o://27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.931117 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.931122 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-node" containerID="cri-o://1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.931031 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="sbdb" containerID="cri-o://0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.983471 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.983829 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.983881 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.984428 4914 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" containerID="cri-o://d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635" gracePeriod=30 Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.984577 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24c5be9264259bf70fbe610f05edf4820e483959d98c60593634eeec5ed85321"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:24:56 crc kubenswrapper[4914]: I0130 21:24:56.984649 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://24c5be9264259bf70fbe610f05edf4820e483959d98c60593634eeec5ed85321" gracePeriod=600 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.283035 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/3.log" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.285698 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovn-acl-logging/0.log" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.286390 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovn-controller/0.log" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.286799 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367003 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rkzbm"] Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367319 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367345 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367361 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367372 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367390 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367401 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367412 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerName="util" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367423 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerName="util" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367433 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" 
containerName="pull" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367442 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerName="pull" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367458 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kubecfg-setup" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367468 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kubecfg-setup" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367483 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="northd" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367494 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="northd" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367506 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="nbdb" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367516 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="nbdb" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367525 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367536 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367551 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="sbdb" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 
21:24:57.367561 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="sbdb" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367576 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerName="extract" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367586 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerName="extract" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367603 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367614 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367636 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-node" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367647 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-node" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.367662 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-acl-logging" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367671 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-acl-logging" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367846 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 21:24:57 crc kubenswrapper[4914]: 
I0130 21:24:57.367862 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="kube-rbac-proxy-node" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367876 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367887 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="nbdb" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367898 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367909 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovn-acl-logging" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367923 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367938 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="sbdb" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367955 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="northd" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367971 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.367985 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7133226d-656c-40ba-9d8b-5c0a011efb4b" containerName="extract" Jan 30 21:24:57 crc 
kubenswrapper[4914]: E0130 21:24:57.368132 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.368146 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.368318 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.368334 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.368505 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.368527 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerName="ovnkube-controller" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.371093 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.464878 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-systemd-units\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.464943 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-netns\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.464964 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-slash\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.464992 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-netd\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465014 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-bin\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465037 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-etc-openvswitch\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465108 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r27rl\" (UniqueName: \"kubernetes.io/projected/6a32fa1f-f3a9-4e60-b665-51138c3ce768-kube-api-access-r27rl\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465138 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovn-node-metrics-cert\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465169 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-ovn\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465211 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-env-overrides\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465232 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-systemd\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc 
kubenswrapper[4914]: I0130 21:24:57.465253 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-kubelet\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465299 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-script-lib\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465322 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-log-socket\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465343 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-ovn-kubernetes\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465365 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-openvswitch\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465414 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-config\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465465 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-node-log\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465484 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465505 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-var-lib-openvswitch\") pod \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\" (UID: \"6a32fa1f-f3a9-4e60-b665-51138c3ce768\") " Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465655 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-env-overrides\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465687 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-etc-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: 
\"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465737 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-cni-bin\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465761 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-slash\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465783 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465810 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-cni-netd\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465836 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-ovn\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465861 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-systemd\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465882 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-log-socket\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465904 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovnkube-config\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465931 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqqj\" (UniqueName: \"kubernetes.io/projected/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-kube-api-access-vsqqj\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465952 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-run-netns\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465976 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-systemd-units\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.465999 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466036 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovnkube-script-lib\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466064 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-var-lib-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466091 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-node-log\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466130 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-kubelet\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466155 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovn-node-metrics-cert\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466179 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466284 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466315 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466338 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-slash" (OuterVolumeSpecName: "host-slash") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466359 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466383 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466407 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466899 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466931 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466953 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-log-socket" (OuterVolumeSpecName: "log-socket") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466985 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.466996 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.467031 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-node-log" (OuterVolumeSpecName: "node-log") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.467065 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.467098 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.467342 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.467354 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.467448 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.472136 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.478295 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a32fa1f-f3a9-4e60-b665-51138c3ce768-kube-api-access-r27rl" (OuterVolumeSpecName: "kube-api-access-r27rl") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "kube-api-access-r27rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.485489 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6a32fa1f-f3a9-4e60-b665-51138c3ce768" (UID: "6a32fa1f-f3a9-4e60-b665-51138c3ce768"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567385 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-node-log\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567452 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-kubelet\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567477 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovn-node-metrics-cert\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567498 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567526 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-env-overrides\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 
21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567549 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-etc-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567572 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-cni-bin\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567592 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-slash\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567600 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567614 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc 
kubenswrapper[4914]: I0130 21:24:57.567665 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-kubelet\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.567546 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-node-log\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568158 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568211 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-cni-bin\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568161 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-cni-netd\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568289 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-ovn\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568332 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-cni-netd\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568340 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-ovn\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568396 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-systemd\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568422 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-log-socket\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568437 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-env-overrides\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568467 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-systemd\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568482 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-log-socket\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568494 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovnkube-config\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568522 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqqj\" (UniqueName: \"kubernetes.io/projected/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-kube-api-access-vsqqj\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568695 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-run-netns\") pod 
\"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568793 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-etc-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568814 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-run-netns\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569053 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovnkube-config\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.568838 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-host-slash\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569204 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-systemd-units\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569325 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569489 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovnkube-script-lib\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569630 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-var-lib-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569852 4914 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569963 4914 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570069 4914 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-kubelet\") on node \"crc\" 
DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570170 4914 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570275 4914 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-log-socket\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570370 4914 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570463 4914 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570561 4914 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570647 4914 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-node-log\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570742 4914 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 
21:24:57.570828 4914 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570912 4914 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570991 4914 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-slash\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571071 4914 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571159 4914 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571250 4914 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571339 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r27rl\" (UniqueName: \"kubernetes.io/projected/6a32fa1f-f3a9-4e60-b665-51138c3ce768-kube-api-access-r27rl\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571419 4914 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571498 4914 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a32fa1f-f3a9-4e60-b665-51138c3ce768-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.571577 4914 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a32fa1f-f3a9-4e60-b665-51138c3ce768-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569413 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-run-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569690 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-var-lib-openvswitch\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.570027 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovnkube-script-lib\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.569235 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-systemd-units\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.572619 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-ovn-node-metrics-cert\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.602062 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqqj\" (UniqueName: \"kubernetes.io/projected/474ac6a3-28e8-4643-a16c-0f218b6f4f1e-kube-api-access-vsqqj\") pod \"ovnkube-node-rkzbm\" (UID: \"474ac6a3-28e8-4643-a16c-0f218b6f4f1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.685559 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:24:57 crc kubenswrapper[4914]: W0130 21:24:57.703842 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474ac6a3_28e8_4643_a16c_0f218b6f4f1e.slice/crio-1df8c487fd18e0abd6bceb3bbe884b68629e68b5f8d2e394464b20a1c4b59663 WatchSource:0}: Error finding container 1df8c487fd18e0abd6bceb3bbe884b68629e68b5f8d2e394464b20a1c4b59663: Status 404 returned error can't find the container with id 1df8c487fd18e0abd6bceb3bbe884b68629e68b5f8d2e394464b20a1c4b59663 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.878162 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="24c5be9264259bf70fbe610f05edf4820e483959d98c60593634eeec5ed85321" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.878227 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"24c5be9264259bf70fbe610f05edf4820e483959d98c60593634eeec5ed85321"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.878535 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"e121058e768dda1d14fe4563b4b94e4252170909803ddfd6651100686fef20ef"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.878557 4914 scope.go:117] "RemoveContainer" containerID="2df647095348fd109e6817a5b9226907389cb72479ca19ac34e62f6c888f7739" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.882971 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovnkube-controller/3.log" Jan 30 21:24:57 crc kubenswrapper[4914]: 
I0130 21:24:57.886296 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovn-acl-logging/0.log" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.886765 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hchqc_6a32fa1f-f3a9-4e60-b665-51138c3ce768/ovn-controller/0.log" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887084 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887107 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887115 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887123 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887129 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887137 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a" exitCode=0 Jan 30 21:24:57 crc kubenswrapper[4914]: 
I0130 21:24:57.887144 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad" exitCode=143 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887151 4914 generic.go:334] "Generic (PLEG): container finished" podID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75" exitCode=143 Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887206 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887239 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887254 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887264 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887268 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887272 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887649 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887664 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887696 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887717 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887723 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887728 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} Jan 30 21:24:57 crc kubenswrapper[4914]: 
I0130 21:24:57.887734 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887739 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887744 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887750 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887761 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887840 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887852 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887859 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887865 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887871 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887877 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887884 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887890 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.887896 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888027 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888035 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888043 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888052 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888058 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888064 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888075 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888081 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888087 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888093 4914 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888098 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888104 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888109 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888116 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hchqc" event={"ID":"6a32fa1f-f3a9-4e60-b665-51138c3ce768","Type":"ContainerDied","Data":"0e42913ac7520e395fbd2e1ca6c66a93240d33e4f47675aea6fe9241c8efc3aa"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888123 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888130 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888135 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} Jan 30 
21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888141 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888146 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888151 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888156 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888161 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888167 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.888174 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.889016 4914 generic.go:334] "Generic (PLEG): container finished" podID="474ac6a3-28e8-4643-a16c-0f218b6f4f1e" containerID="29f8a160fbd94117f33793eedb341bfcae0427270d3bb970338226f8a34cb6b9" exitCode=0
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.889073 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerDied","Data":"29f8a160fbd94117f33793eedb341bfcae0427270d3bb970338226f8a34cb6b9"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.889104 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"1df8c487fd18e0abd6bceb3bbe884b68629e68b5f8d2e394464b20a1c4b59663"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.890564 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/2.log"
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.890931 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/1.log"
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.890961 4914 generic.go:334] "Generic (PLEG): container finished" podID="c1067fc5-9bff-4a81-982f-b2cca1c432d0" containerID="c2b89d677b10a0c9096fdbb15c317ca43c6c9d680a668ca53e06449829acfd01" exitCode=2
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.890981 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerDied","Data":"c2b89d677b10a0c9096fdbb15c317ca43c6c9d680a668ca53e06449829acfd01"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.891000 4914 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d"}
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.891270 4914 scope.go:117] "RemoveContainer" containerID="c2b89d677b10a0c9096fdbb15c317ca43c6c9d680a668ca53e06449829acfd01"
Jan 30 21:24:57 crc kubenswrapper[4914]: E0130 21:24:57.891405 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wvbd7_openshift-multus(c1067fc5-9bff-4a81-982f-b2cca1c432d0)\"" pod="openshift-multus/multus-wvbd7" podUID="c1067fc5-9bff-4a81-982f-b2cca1c432d0"
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.925507 4914 scope.go:117] "RemoveContainer" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.954948 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hchqc"]
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.959933 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hchqc"]
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.966134 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:24:57 crc kubenswrapper[4914]: I0130 21:24:57.996211 4914 scope.go:117] "RemoveContainer" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.012632 4914 scope.go:117] "RemoveContainer" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.027446 4914 scope.go:117] "RemoveContainer" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.047825 4914 scope.go:117] "RemoveContainer" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.064077 4914 scope.go:117] "RemoveContainer" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.100942 4914 scope.go:117] "RemoveContainer" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.126014 4914 scope.go:117] "RemoveContainer" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.155418 4914 scope.go:117] "RemoveContainer" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.176610 4914 scope.go:117] "RemoveContainer" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.176875 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": container with ID starting with d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635 not found: ID does not exist" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.176905 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} err="failed to get container status \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": rpc error: code = NotFound desc = could not find container \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": container with ID starting with d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.176926 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.177224 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": container with ID starting with 11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0 not found: ID does not exist" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177243 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} err="failed to get container status \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": rpc error: code = NotFound desc = could not find container \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": container with ID starting with 11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177255 4914 scope.go:117] "RemoveContainer" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.177453 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": container with ID starting with 0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412 not found: ID does not exist" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177470 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} err="failed to get container status \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": rpc error: code = NotFound desc = could not find container \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": container with ID starting with 0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177482 4914 scope.go:117] "RemoveContainer" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.177774 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": container with ID starting with d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d not found: ID does not exist" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177795 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} err="failed to get container status \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": rpc error: code = NotFound desc = could not find container \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": container with ID starting with d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177809 4914 scope.go:117] "RemoveContainer" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.177969 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": container with ID starting with a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6 not found: ID does not exist" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.177988 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} err="failed to get container status \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": rpc error: code = NotFound desc = could not find container \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": container with ID starting with a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178002 4914 scope.go:117] "RemoveContainer" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.178269 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": container with ID starting with 9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e not found: ID does not exist" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178291 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} err="failed to get container status \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": rpc error: code = NotFound desc = could not find container \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": container with ID starting with 9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178304 4914 scope.go:117] "RemoveContainer" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.178497 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": container with ID starting with 1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a not found: ID does not exist" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178515 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} err="failed to get container status \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": rpc error: code = NotFound desc = could not find container \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": container with ID starting with 1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178527 4914 scope.go:117] "RemoveContainer" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.178800 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": container with ID starting with 27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad not found: ID does not exist" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178819 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} err="failed to get container status \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": rpc error: code = NotFound desc = could not find container \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": container with ID starting with 27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.178833 4914 scope.go:117] "RemoveContainer" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.179009 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": container with ID starting with 27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75 not found: ID does not exist" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.179029 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} err="failed to get container status \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": rpc error: code = NotFound desc = could not find container \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": container with ID starting with 27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.179041 4914 scope.go:117] "RemoveContainer" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"
Jan 30 21:24:58 crc kubenswrapper[4914]: E0130 21:24:58.179322 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": container with ID starting with 1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18 not found: ID does not exist" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.179341 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} err="failed to get container status \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": rpc error: code = NotFound desc = could not find container \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": container with ID starting with 1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.179353 4914 scope.go:117] "RemoveContainer" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.179570 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} err="failed to get container status \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": rpc error: code = NotFound desc = could not find container \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": container with ID starting with d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.179588 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.180366 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} err="failed to get container status \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": rpc error: code = NotFound desc = could not find container \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": container with ID starting with 11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.180384 4914 scope.go:117] "RemoveContainer" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.180591 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} err="failed to get container status \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": rpc error: code = NotFound desc = could not find container \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": container with ID starting with 0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.180609 4914 scope.go:117] "RemoveContainer" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.180916 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} err="failed to get container status \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": rpc error: code = NotFound desc = could not find container \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": container with ID starting with d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.180934 4914 scope.go:117] "RemoveContainer" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181131 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} err="failed to get container status \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": rpc error: code = NotFound desc = could not find container \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": container with ID starting with a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181147 4914 scope.go:117] "RemoveContainer" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181388 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} err="failed to get container status \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": rpc error: code = NotFound desc = could not find container \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": container with ID starting with 9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181406 4914 scope.go:117] "RemoveContainer" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181571 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} err="failed to get container status \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": rpc error: code = NotFound desc = could not find container \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": container with ID starting with 1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181586 4914 scope.go:117] "RemoveContainer" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181871 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} err="failed to get container status \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": rpc error: code = NotFound desc = could not find container \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": container with ID starting with 27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.181887 4914 scope.go:117] "RemoveContainer" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.183909 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} err="failed to get container status \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": rpc error: code = NotFound desc = could not find container \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": container with ID starting with 27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.183964 4914 scope.go:117] "RemoveContainer" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.185732 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} err="failed to get container status \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": rpc error: code = NotFound desc = could not find container \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": container with ID starting with 1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.185755 4914 scope.go:117] "RemoveContainer" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186021 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} err="failed to get container status \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": rpc error: code = NotFound desc = could not find container \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": container with ID starting with d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186039 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186250 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} err="failed to get container status \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": rpc error: code = NotFound desc = could not find container \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": container with ID starting with 11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186269 4914 scope.go:117] "RemoveContainer" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186535 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} err="failed to get container status \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": rpc error: code = NotFound desc = could not find container \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": container with ID starting with 0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186572 4914 scope.go:117] "RemoveContainer" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186853 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} err="failed to get container status \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": rpc error: code = NotFound desc = could not find container \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": container with ID starting with d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.186910 4914 scope.go:117] "RemoveContainer" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.187435 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} err="failed to get container status \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": rpc error: code = NotFound desc = could not find container \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": container with ID starting with a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.187482 4914 scope.go:117] "RemoveContainer" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.187692 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} err="failed to get container status \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": rpc error: code = NotFound desc = could not find container \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": container with ID starting with 9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.187749 4914 scope.go:117] "RemoveContainer" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.187932 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} err="failed to get container status \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": rpc error: code = NotFound desc = could not find container \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": container with ID starting with 1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.187949 4914 scope.go:117] "RemoveContainer" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188168 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} err="failed to get container status \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": rpc error: code = NotFound desc = could not find container \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": container with ID starting with 27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188194 4914 scope.go:117] "RemoveContainer" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188404 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} err="failed to get container status \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": rpc error: code = NotFound desc = could not find container \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": container with ID starting with 27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188424 4914 scope.go:117] "RemoveContainer" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188629 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} err="failed to get container status \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": rpc error: code = NotFound desc = could not find container \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": container with ID starting with 1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188653 4914 scope.go:117] "RemoveContainer" containerID="d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188842 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635"} err="failed to get container status \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": rpc error: code = NotFound desc = could not find container \"d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635\": container with ID starting with d5c8733abd74fdd76e47876ae91b2dc2379b1758cd42f5ce7e217d17f047c635 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.188862 4914 scope.go:117] "RemoveContainer" containerID="11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.189109 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0"} err="failed to get container status \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": rpc error: code = NotFound desc = could not find container \"11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0\": container with ID starting with 11bd5408d1e8d0a28e145b5c4b4c8862d03fb2615771823e9162225727ec11a0 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.189132 4914 scope.go:117] "RemoveContainer" containerID="0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.189324 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412"} err="failed to get container status \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": rpc error: code = NotFound desc = could not find container \"0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412\": container with ID starting with 0d751f7366588b92f9e1b45da0e9ba81d995a32b60055643bb73264a65812412 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.189341 4914 scope.go:117] "RemoveContainer" containerID="d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.189630 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d"} err="failed to get container status \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": rpc error: code = NotFound desc = could not find container \"d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d\": container with ID starting with d39eac0410166480a4cafe51991f4b9f79d6242ca08849d9d1e945bfe37eff8d not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.189648 4914 scope.go:117] "RemoveContainer" containerID="a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.190888 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6"} err="failed to get container status \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": rpc error: code = NotFound desc = could not find container \"a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6\": container with ID starting with a7ac9ea26e8f9c415aa4275670740ce058254f88fa7bc4087a45963db6f1eae6 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.190906 4914 scope.go:117] "RemoveContainer" containerID="9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191144 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e"} err="failed to get container status \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": rpc error: code = NotFound desc = could not find container \"9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e\": container with ID starting with 9675032df222b2dd9cd105875454cae29fefbcd941520a9e710b111babd8f79e not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191161 4914 scope.go:117] "RemoveContainer" containerID="1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191340 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a"} err="failed to get container status \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": rpc error: code = NotFound desc = could not find container \"1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a\": container with ID starting with 1f62fa9937bd0a5e3700c2cb30f35aa33de6dbd83fe7fcc050c4e3914375f54a not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191377 4914 scope.go:117] "RemoveContainer" containerID="27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191537 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad"} err="failed to get container status \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": rpc error: code = NotFound desc = could not find container \"27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad\": container with ID starting with 27e87072dc648d5d11e281760ebff8345c51a49aa9be363c7db753c274d477ad not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191554 4914 scope.go:117] "RemoveContainer" containerID="27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191730 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75"} err="failed to get container status \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": rpc error: code = NotFound desc = could not find container \"27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75\": container with ID starting with 27acc9a7dc2e39d37a6d1de3835a52a781f6a2589f4b8a72b0b78bc27e24cc75 not found: ID does not exist"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191748 4914 scope.go:117] "RemoveContainer" containerID="1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"
Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.191896 4914 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18"} err="failed to get container status \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": rpc error: code = NotFound desc = could not find container \"1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18\": container with ID starting with 1a08e13b820843b37d468de639c334c4beb47146194566829a1b31d9e7b6ba18 not found: ID does not exist" Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.899477 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"3212b48158ba215db00ab5bd81eeb0cb1ea9e805e38bc634c9391f27853d80db"} Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.899955 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"130f3fa286b06b05936794dd0b34197adbcec48b0decb0e9edfa14d574ce05ed"} Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.900020 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"eedc8b9c89a6600a20cc1fbd39286c47e4ccf1c65f78b9451e66e599787ec474"} Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.900079 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"e66566a538b815742b9a10cef98b3c2dabbcba092b701a57e6218aa128f9c865"} Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.900137 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" 
event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"3b9eef4d245b01335499153784dfb1b333ed359e06909a5301ac41c6bac4e336"} Jan 30 21:24:58 crc kubenswrapper[4914]: I0130 21:24:58.900195 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"0f04bf39c9ce3a70eda070d627baffeb9984bb1c252bf2383cb8a387f37e9a56"} Jan 30 21:24:59 crc kubenswrapper[4914]: I0130 21:24:59.825076 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a32fa1f-f3a9-4e60-b665-51138c3ce768" path="/var/lib/kubelet/pods/6a32fa1f-f3a9-4e60-b665-51138c3ce768/volumes" Jan 30 21:25:01 crc kubenswrapper[4914]: I0130 21:25:01.921516 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"4bf0d89d59251216fcdf19189f618229b3fe28dfdb0c1560da897374ba23cf30"} Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.347820 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9752c"] Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.348595 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.351001 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-klj4z" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.352422 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.352935 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.439887 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcs6s\" (UniqueName: \"kubernetes.io/projected/1633e963-9082-4659-af26-20bc3b1e512b-kube-api-access-gcs6s\") pod \"obo-prometheus-operator-68bc856cb9-9752c\" (UID: \"1633e963-9082-4659-af26-20bc3b1e512b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.471373 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"] Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.472124 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.473634 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-t7ln6" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.473632 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.481318 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"] Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.481966 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.541166 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcs6s\" (UniqueName: \"kubernetes.io/projected/1633e963-9082-4659-af26-20bc3b1e512b-kube-api-access-gcs6s\") pod \"obo-prometheus-operator-68bc856cb9-9752c\" (UID: \"1633e963-9082-4659-af26-20bc3b1e512b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.561950 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcs6s\" (UniqueName: \"kubernetes.io/projected/1633e963-9082-4659-af26-20bc3b1e512b-kube-api-access-gcs6s\") pod \"obo-prometheus-operator-68bc856cb9-9752c\" (UID: \"1633e963-9082-4659-af26-20bc3b1e512b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.642753 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb\" (UID: \"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.642835 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb\" (UID: \"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.642881 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/128f28df-6fdd-4a2c-86af-6dfe33baf2c9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz\" (UID: \"128f28df-6fdd-4a2c-86af-6dfe33baf2c9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.642901 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/128f28df-6fdd-4a2c-86af-6dfe33baf2c9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz\" (UID: \"128f28df-6fdd-4a2c-86af-6dfe33baf2c9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.665618 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.689116 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gf6nm"] Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.689915 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.691798 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-nfzgj" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.691824 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.692164 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(29997c1ac49da3fb01772471f7d631b8bab464fbe48261a765d2aee106c9d244): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.692213 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(29997c1ac49da3fb01772471f7d631b8bab464fbe48261a765d2aee106c9d244): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.692232 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(29997c1ac49da3fb01772471f7d631b8bab464fbe48261a765d2aee106c9d244): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.692269 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-9752c_openshift-operators(1633e963-9082-4659-af26-20bc3b1e512b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-9752c_openshift-operators(1633e963-9082-4659-af26-20bc3b1e512b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(29997c1ac49da3fb01772471f7d631b8bab464fbe48261a765d2aee106c9d244): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" podUID="1633e963-9082-4659-af26-20bc3b1e512b" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.744050 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb\" (UID: \"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.744117 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb\" (UID: \"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.744153 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/128f28df-6fdd-4a2c-86af-6dfe33baf2c9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz\" (UID: \"128f28df-6fdd-4a2c-86af-6dfe33baf2c9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.744168 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/128f28df-6fdd-4a2c-86af-6dfe33baf2c9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz\" (UID: \"128f28df-6fdd-4a2c-86af-6dfe33baf2c9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc 
kubenswrapper[4914]: I0130 21:25:03.747550 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/128f28df-6fdd-4a2c-86af-6dfe33baf2c9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz\" (UID: \"128f28df-6fdd-4a2c-86af-6dfe33baf2c9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.747875 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb\" (UID: \"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.749506 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/128f28df-6fdd-4a2c-86af-6dfe33baf2c9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz\" (UID: \"128f28df-6fdd-4a2c-86af-6dfe33baf2c9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.750370 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb\" (UID: \"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.794919 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-6c2k7"] Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 
21:25:03.795568 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.795751 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.798215 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-9t52g" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.805873 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.829499 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(84926ec14af9fd642733476c6c807996ca39cf285db1a868f697660e0cad9683): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.829586 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(84926ec14af9fd642733476c6c807996ca39cf285db1a868f697660e0cad9683): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.829611 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(84926ec14af9fd642733476c6c807996ca39cf285db1a868f697660e0cad9683): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.829696 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators(0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators(0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(84926ec14af9fd642733476c6c807996ca39cf285db1a868f697660e0cad9683): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" podUID="0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.842523 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(ebf962de7c78a9a161e80182f8d58729fe3a6451558c7fe780d9256a602d27dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.842585 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(ebf962de7c78a9a161e80182f8d58729fe3a6451558c7fe780d9256a602d27dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.842604 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(ebf962de7c78a9a161e80182f8d58729fe3a6451558c7fe780d9256a602d27dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:03 crc kubenswrapper[4914]: E0130 21:25:03.842643 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators(128f28df-6fdd-4a2c-86af-6dfe33baf2c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators(128f28df-6fdd-4a2c-86af-6dfe33baf2c9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(ebf962de7c78a9a161e80182f8d58729fe3a6451558c7fe780d9256a602d27dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" podUID="128f28df-6fdd-4a2c-86af-6dfe33baf2c9" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.847449 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f547924-f70d-41f4-8461-57953f81d9ac-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gf6nm\" (UID: \"0f547924-f70d-41f4-8461-57953f81d9ac\") " pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.847502 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ffxb\" (UniqueName: \"kubernetes.io/projected/0f547924-f70d-41f4-8461-57953f81d9ac-kube-api-access-5ffxb\") pod \"observability-operator-59bdc8b94-gf6nm\" (UID: \"0f547924-f70d-41f4-8461-57953f81d9ac\") " pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 
21:25:03.941777 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" event={"ID":"474ac6a3-28e8-4643-a16c-0f218b6f4f1e","Type":"ContainerStarted","Data":"ce6ddb4b423254d18cdee8e270267c25eb871389d0cd58fc2e6a840c602d2cbd"} Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.942239 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.942286 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.948982 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b8e78e3-d709-4289-b9aa-15a6270a66d0-openshift-service-ca\") pod \"perses-operator-5bf474d74f-6c2k7\" (UID: \"1b8e78e3-d709-4289-b9aa-15a6270a66d0\") " pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.949052 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72lh\" (UniqueName: \"kubernetes.io/projected/1b8e78e3-d709-4289-b9aa-15a6270a66d0-kube-api-access-d72lh\") pod \"perses-operator-5bf474d74f-6c2k7\" (UID: \"1b8e78e3-d709-4289-b9aa-15a6270a66d0\") " pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.949172 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f547924-f70d-41f4-8461-57953f81d9ac-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gf6nm\" (UID: \"0f547924-f70d-41f4-8461-57953f81d9ac\") " pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: 
I0130 21:25:03.949267 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ffxb\" (UniqueName: \"kubernetes.io/projected/0f547924-f70d-41f4-8461-57953f81d9ac-kube-api-access-5ffxb\") pod \"observability-operator-59bdc8b94-gf6nm\" (UID: \"0f547924-f70d-41f4-8461-57953f81d9ac\") " pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.953342 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f547924-f70d-41f4-8461-57953f81d9ac-observability-operator-tls\") pod \"observability-operator-59bdc8b94-gf6nm\" (UID: \"0f547924-f70d-41f4-8461-57953f81d9ac\") " pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.969627 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ffxb\" (UniqueName: \"kubernetes.io/projected/0f547924-f70d-41f4-8461-57953f81d9ac-kube-api-access-5ffxb\") pod \"observability-operator-59bdc8b94-gf6nm\" (UID: \"0f547924-f70d-41f4-8461-57953f81d9ac\") " pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:03 crc kubenswrapper[4914]: I0130 21:25:03.979297 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" podStartSLOduration=6.979280799 podStartE2EDuration="6.979280799s" podCreationTimestamp="2026-01-30 21:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:25:03.978385737 +0000 UTC m=+637.417022498" watchObservedRunningTime="2026-01-30 21:25:03.979280799 +0000 UTC m=+637.417917560" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.003261 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.016220 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.050415 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72lh\" (UniqueName: \"kubernetes.io/projected/1b8e78e3-d709-4289-b9aa-15a6270a66d0-kube-api-access-d72lh\") pod \"perses-operator-5bf474d74f-6c2k7\" (UID: \"1b8e78e3-d709-4289-b9aa-15a6270a66d0\") " pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.050545 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b8e78e3-d709-4289-b9aa-15a6270a66d0-openshift-service-ca\") pod \"perses-operator-5bf474d74f-6c2k7\" (UID: \"1b8e78e3-d709-4289-b9aa-15a6270a66d0\") " pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.051605 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b8e78e3-d709-4289-b9aa-15a6270a66d0-openshift-service-ca\") pod \"perses-operator-5bf474d74f-6c2k7\" (UID: \"1b8e78e3-d709-4289-b9aa-15a6270a66d0\") " pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.057103 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(93de9edce96c55563839875b00dd30cbf2d7c7edd273f2925eb259232b38186f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.057164 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(93de9edce96c55563839875b00dd30cbf2d7c7edd273f2925eb259232b38186f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.057185 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(93de9edce96c55563839875b00dd30cbf2d7c7edd273f2925eb259232b38186f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.057219 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-gf6nm_openshift-operators(0f547924-f70d-41f4-8461-57953f81d9ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-gf6nm_openshift-operators(0f547924-f70d-41f4-8461-57953f81d9ac)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(93de9edce96c55563839875b00dd30cbf2d7c7edd273f2925eb259232b38186f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" podUID="0f547924-f70d-41f4-8461-57953f81d9ac" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.074518 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72lh\" (UniqueName: \"kubernetes.io/projected/1b8e78e3-d709-4289-b9aa-15a6270a66d0-kube-api-access-d72lh\") pod \"perses-operator-5bf474d74f-6c2k7\" (UID: \"1b8e78e3-d709-4289-b9aa-15a6270a66d0\") " pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.116139 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.140755 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f403558e2022eb88bf157720ff06a1898af7c568d7171fed59f193a95f92eee8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.140819 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f403558e2022eb88bf157720ff06a1898af7c568d7171fed59f193a95f92eee8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.140841 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f403558e2022eb88bf157720ff06a1898af7c568d7171fed59f193a95f92eee8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.140885 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f403558e2022eb88bf157720ff06a1898af7c568d7171fed59f193a95f92eee8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" podUID="1b8e78e3-d709-4289-b9aa-15a6270a66d0" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.187223 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-6c2k7"] Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.205742 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gf6nm"] Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.209836 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"] Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.209976 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.210414 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.213992 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9752c"] Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.214129 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.214562 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.231470 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"] Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.231577 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.231977 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.233100 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(298955d53281c11231c835b2304e33be65c2f91b84e8ae952e38e0aa4cd178eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.233165 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(298955d53281c11231c835b2304e33be65c2f91b84e8ae952e38e0aa4cd178eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.233189 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(298955d53281c11231c835b2304e33be65c2f91b84e8ae952e38e0aa4cd178eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.233237 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators(128f28df-6fdd-4a2c-86af-6dfe33baf2c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators(128f28df-6fdd-4a2c-86af-6dfe33baf2c9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(298955d53281c11231c835b2304e33be65c2f91b84e8ae952e38e0aa4cd178eb): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" podUID="128f28df-6fdd-4a2c-86af-6dfe33baf2c9" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.241456 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(bda6b108b76b42412af1bc7c20a7b6c002895c0f67188d630dff7aa7e56cda95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.241530 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(bda6b108b76b42412af1bc7c20a7b6c002895c0f67188d630dff7aa7e56cda95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.241570 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(bda6b108b76b42412af1bc7c20a7b6c002895c0f67188d630dff7aa7e56cda95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.241625 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-9752c_openshift-operators(1633e963-9082-4659-af26-20bc3b1e512b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-9752c_openshift-operators(1633e963-9082-4659-af26-20bc3b1e512b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(bda6b108b76b42412af1bc7c20a7b6c002895c0f67188d630dff7aa7e56cda95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" podUID="1633e963-9082-4659-af26-20bc3b1e512b" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.266351 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(548d9edacbb949e6e3b9170808add249cc8507c2abfb3925ad8eaec4e9ec6a68): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.266426 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(548d9edacbb949e6e3b9170808add249cc8507c2abfb3925ad8eaec4e9ec6a68): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.266455 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(548d9edacbb949e6e3b9170808add249cc8507c2abfb3925ad8eaec4e9ec6a68): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" Jan 30 21:25:04 crc kubenswrapper[4914]: E0130 21:25:04.266510 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators(0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators(0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(548d9edacbb949e6e3b9170808add249cc8507c2abfb3925ad8eaec4e9ec6a68): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" podUID="0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.946484 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.946495 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.946917 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.947050 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:04 crc kubenswrapper[4914]: I0130 21:25:04.947646 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.008727 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(ac6cc220419746743c242bb3e2bb3750e08ecd1f0ba8f39a447a62a6ff14386c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.008821 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(ac6cc220419746743c242bb3e2bb3750e08ecd1f0ba8f39a447a62a6ff14386c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.008850 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(ac6cc220419746743c242bb3e2bb3750e08ecd1f0ba8f39a447a62a6ff14386c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.008897 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(ac6cc220419746743c242bb3e2bb3750e08ecd1f0ba8f39a447a62a6ff14386c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" podUID="1b8e78e3-d709-4289-b9aa-15a6270a66d0" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.012163 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(4c9a89072092981a11b5484dd2c007f36d0ac0a9c94a16814674a892fcfc547f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.012201 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(4c9a89072092981a11b5484dd2c007f36d0ac0a9c94a16814674a892fcfc547f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.012221 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(4c9a89072092981a11b5484dd2c007f36d0ac0a9c94a16814674a892fcfc547f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:05 crc kubenswrapper[4914]: E0130 21:25:05.012258 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-gf6nm_openshift-operators(0f547924-f70d-41f4-8461-57953f81d9ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-gf6nm_openshift-operators(0f547924-f70d-41f4-8461-57953f81d9ac)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(4c9a89072092981a11b5484dd2c007f36d0ac0a9c94a16814674a892fcfc547f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" podUID="0f547924-f70d-41f4-8461-57953f81d9ac" Jan 30 21:25:05 crc kubenswrapper[4914]: I0130 21:25:05.016116 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm" Jan 30 21:25:12 crc kubenswrapper[4914]: I0130 21:25:12.818959 4914 scope.go:117] "RemoveContainer" containerID="c2b89d677b10a0c9096fdbb15c317ca43c6c9d680a668ca53e06449829acfd01" Jan 30 21:25:12 crc kubenswrapper[4914]: E0130 21:25:12.820502 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wvbd7_openshift-multus(c1067fc5-9bff-4a81-982f-b2cca1c432d0)\"" pod="openshift-multus/multus-wvbd7" podUID="c1067fc5-9bff-4a81-982f-b2cca1c432d0" Jan 30 21:25:15 crc kubenswrapper[4914]: I0130 21:25:15.817028 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:15 crc kubenswrapper[4914]: I0130 21:25:15.817065 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:15 crc kubenswrapper[4914]: I0130 21:25:15.817747 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:15 crc kubenswrapper[4914]: I0130 21:25:15.817938 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.867405 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(e1f3d443968573945832a90c3785323ea24ec63287f52d9f9d5e2af6806ed357): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.867459 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(e1f3d443968573945832a90c3785323ea24ec63287f52d9f9d5e2af6806ed357): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.867480 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(e1f3d443968573945832a90c3785323ea24ec63287f52d9f9d5e2af6806ed357): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.867518 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-9752c_openshift-operators(1633e963-9082-4659-af26-20bc3b1e512b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-9752c_openshift-operators(1633e963-9082-4659-af26-20bc3b1e512b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9752c_openshift-operators_1633e963-9082-4659-af26-20bc3b1e512b_0(e1f3d443968573945832a90c3785323ea24ec63287f52d9f9d5e2af6806ed357): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" podUID="1633e963-9082-4659-af26-20bc3b1e512b" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.874386 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(139f6f45af6a7d07ad697d0df1d666d21275e02e640e291cf588c3887f4b73fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.874424 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(139f6f45af6a7d07ad697d0df1d666d21275e02e640e291cf588c3887f4b73fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.874441 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(139f6f45af6a7d07ad697d0df1d666d21275e02e640e291cf588c3887f4b73fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" Jan 30 21:25:15 crc kubenswrapper[4914]: E0130 21:25:15.874472 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(139f6f45af6a7d07ad697d0df1d666d21275e02e640e291cf588c3887f4b73fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" podUID="1b8e78e3-d709-4289-b9aa-15a6270a66d0" Jan 30 21:25:16 crc kubenswrapper[4914]: I0130 21:25:16.817289 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:16 crc kubenswrapper[4914]: I0130 21:25:16.818013 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:16 crc kubenswrapper[4914]: E0130 21:25:16.855486 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(5fb65964224b06ccbd0efa1d1104476f262b09174a5ce6de94e3b42d77b316e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:25:16 crc kubenswrapper[4914]: E0130 21:25:16.855563 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(5fb65964224b06ccbd0efa1d1104476f262b09174a5ce6de94e3b42d77b316e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:16 crc kubenswrapper[4914]: E0130 21:25:16.855593 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(5fb65964224b06ccbd0efa1d1104476f262b09174a5ce6de94e3b42d77b316e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" Jan 30 21:25:16 crc kubenswrapper[4914]: E0130 21:25:16.855650 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-gf6nm_openshift-operators(0f547924-f70d-41f4-8461-57953f81d9ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-gf6nm_openshift-operators(0f547924-f70d-41f4-8461-57953f81d9ac)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-gf6nm_openshift-operators_0f547924-f70d-41f4-8461-57953f81d9ac_0(5fb65964224b06ccbd0efa1d1104476f262b09174a5ce6de94e3b42d77b316e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" podUID="0f547924-f70d-41f4-8461-57953f81d9ac" Jan 30 21:25:17 crc kubenswrapper[4914]: I0130 21:25:17.817204 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:17 crc kubenswrapper[4914]: I0130 21:25:17.823767 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" Jan 30 21:25:17 crc kubenswrapper[4914]: E0130 21:25:17.862299 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(79a8242f8edc8ae8edb5737f8dc3dd9416be2d52feef7144e853546bf24477b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 30 21:25:17 crc kubenswrapper[4914]: E0130 21:25:17.862446 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(79a8242f8edc8ae8edb5737f8dc3dd9416be2d52feef7144e853546bf24477b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"
Jan 30 21:25:17 crc kubenswrapper[4914]: E0130 21:25:17.862504 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(79a8242f8edc8ae8edb5737f8dc3dd9416be2d52feef7144e853546bf24477b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"
Jan 30 21:25:17 crc kubenswrapper[4914]: E0130 21:25:17.862643 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators(128f28df-6fdd-4a2c-86af-6dfe33baf2c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators(128f28df-6fdd-4a2c-86af-6dfe33baf2c9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_openshift-operators_128f28df-6fdd-4a2c-86af-6dfe33baf2c9_0(79a8242f8edc8ae8edb5737f8dc3dd9416be2d52feef7144e853546bf24477b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" podUID="128f28df-6fdd-4a2c-86af-6dfe33baf2c9"
Jan 30 21:25:19 crc kubenswrapper[4914]: I0130 21:25:19.817569 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"
Jan 30 21:25:19 crc kubenswrapper[4914]: I0130 21:25:19.818067 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"
Jan 30 21:25:19 crc kubenswrapper[4914]: E0130 21:25:19.838889 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(26f4df32362c14872123102ce5d5ac5dcfd70999c46d21f001d5b026ec84a44d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 21:25:19 crc kubenswrapper[4914]: E0130 21:25:19.838969 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(26f4df32362c14872123102ce5d5ac5dcfd70999c46d21f001d5b026ec84a44d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"
Jan 30 21:25:19 crc kubenswrapper[4914]: E0130 21:25:19.838994 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(26f4df32362c14872123102ce5d5ac5dcfd70999c46d21f001d5b026ec84a44d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"
Jan 30 21:25:19 crc kubenswrapper[4914]: E0130 21:25:19.839052 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators(0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators(0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_openshift-operators_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae_0(26f4df32362c14872123102ce5d5ac5dcfd70999c46d21f001d5b026ec84a44d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" podUID="0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae"
Jan 30 21:25:25 crc kubenswrapper[4914]: I0130 21:25:25.818294 4914 scope.go:117] "RemoveContainer" containerID="c2b89d677b10a0c9096fdbb15c317ca43c6c9d680a668ca53e06449829acfd01"
Jan 30 21:25:26 crc kubenswrapper[4914]: I0130 21:25:26.067294 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/2.log"
Jan 30 21:25:26 crc kubenswrapper[4914]: I0130 21:25:26.067977 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/1.log"
Jan 30 21:25:26 crc kubenswrapper[4914]: I0130 21:25:26.068031 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvbd7" event={"ID":"c1067fc5-9bff-4a81-982f-b2cca1c432d0","Type":"ContainerStarted","Data":"4bd392a6d2868b4c28e9f880a6067d5a1cc50c973d53131ebdae1e5f75e8acb9"}
Jan 30 21:25:26 crc kubenswrapper[4914]: I0130 21:25:26.817775 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:26 crc kubenswrapper[4914]: I0130 21:25:26.818261 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:26 crc kubenswrapper[4914]: E0130 21:25:26.851977 4914 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f7b7635907d15f0edcf5a36d668eb6963479efd9e5a1c864ec3439bb1c9fdac4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 30 21:25:26 crc kubenswrapper[4914]: E0130 21:25:26.852063 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f7b7635907d15f0edcf5a36d668eb6963479efd9e5a1c864ec3439bb1c9fdac4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:26 crc kubenswrapper[4914]: E0130 21:25:26.852102 4914 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f7b7635907d15f0edcf5a36d668eb6963479efd9e5a1c864ec3439bb1c9fdac4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:26 crc kubenswrapper[4914]: E0130 21:25:26.852172 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-6c2k7_openshift-operators(1b8e78e3-d709-4289-b9aa-15a6270a66d0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-6c2k7_openshift-operators_1b8e78e3-d709-4289-b9aa-15a6270a66d0_0(f7b7635907d15f0edcf5a36d668eb6963479efd9e5a1c864ec3439bb1c9fdac4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" podUID="1b8e78e3-d709-4289-b9aa-15a6270a66d0"
Jan 30 21:25:27 crc kubenswrapper[4914]: I0130 21:25:27.714272 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rkzbm"
Jan 30 21:25:27 crc kubenswrapper[4914]: I0130 21:25:27.818540 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c"
Jan 30 21:25:27 crc kubenswrapper[4914]: I0130 21:25:27.832520 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c"
Jan 30 21:25:28 crc kubenswrapper[4914]: I0130 21:25:28.054209 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9752c"]
Jan 30 21:25:28 crc kubenswrapper[4914]: W0130 21:25:28.064985 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1633e963_9082_4659_af26_20bc3b1e512b.slice/crio-c2dfda8beea250b7df49871915f332e2b1be68e9db2a5a055d9bb2ac62bfa4ff WatchSource:0}: Error finding container c2dfda8beea250b7df49871915f332e2b1be68e9db2a5a055d9bb2ac62bfa4ff: Status 404 returned error can't find the container with id c2dfda8beea250b7df49871915f332e2b1be68e9db2a5a055d9bb2ac62bfa4ff
Jan 30 21:25:28 crc kubenswrapper[4914]: I0130 21:25:28.080996 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" event={"ID":"1633e963-9082-4659-af26-20bc3b1e512b","Type":"ContainerStarted","Data":"c2dfda8beea250b7df49871915f332e2b1be68e9db2a5a055d9bb2ac62bfa4ff"}
Jan 30 21:25:28 crc kubenswrapper[4914]: I0130 21:25:28.257718 4914 scope.go:117] "RemoveContainer" containerID="556e77646daeedff4e7f95f018b7c7bec78863ade5c39385eb31ec26341e4d7d"
Jan 30 21:25:29 crc kubenswrapper[4914]: I0130 21:25:29.090126 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvbd7_c1067fc5-9bff-4a81-982f-b2cca1c432d0/kube-multus/2.log"
Jan 30 21:25:30 crc kubenswrapper[4914]: I0130 21:25:30.818500 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm"
Jan 30 21:25:30 crc kubenswrapper[4914]: I0130 21:25:30.819531 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm"
Jan 30 21:25:31 crc kubenswrapper[4914]: I0130 21:25:31.817917 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"
Jan 30 21:25:31 crc kubenswrapper[4914]: I0130 21:25:31.817974 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"
Jan 30 21:25:31 crc kubenswrapper[4914]: I0130 21:25:31.818509 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"
Jan 30 21:25:31 crc kubenswrapper[4914]: I0130 21:25:31.818980 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"
Jan 30 21:25:32 crc kubenswrapper[4914]: I0130 21:25:32.055284 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz"]
Jan 30 21:25:32 crc kubenswrapper[4914]: I0130 21:25:32.104327 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" event={"ID":"128f28df-6fdd-4a2c-86af-6dfe33baf2c9","Type":"ContainerStarted","Data":"4491313214d62c6169ca5545f1dc033d8f91820d242192a9d6035d8f63a566f3"}
Jan 30 21:25:32 crc kubenswrapper[4914]: I0130 21:25:32.336965 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb"]
Jan 30 21:25:32 crc kubenswrapper[4914]: I0130 21:25:32.342815 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-gf6nm"]
Jan 30 21:25:32 crc kubenswrapper[4914]: W0130 21:25:32.347731 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6a7dac_bcc3_4acc_a5c0_aa26c17e28ae.slice/crio-11183b6fefba10ace2f05cdcf9891d356bc540e080130d9c64a44c4cd6239ba2 WatchSource:0}: Error finding container 11183b6fefba10ace2f05cdcf9891d356bc540e080130d9c64a44c4cd6239ba2: Status 404 returned error can't find the container with id 11183b6fefba10ace2f05cdcf9891d356bc540e080130d9c64a44c4cd6239ba2
Jan 30 21:25:32 crc kubenswrapper[4914]: W0130 21:25:32.349688 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f547924_f70d_41f4_8461_57953f81d9ac.slice/crio-a2daaf584e7b2db8ff1350318bb9af9d6374f9efe7f6bad38fe778016fb92c43 WatchSource:0}: Error finding container a2daaf584e7b2db8ff1350318bb9af9d6374f9efe7f6bad38fe778016fb92c43: Status 404 returned error can't find the container with id a2daaf584e7b2db8ff1350318bb9af9d6374f9efe7f6bad38fe778016fb92c43
Jan 30 21:25:33 crc kubenswrapper[4914]: I0130 21:25:33.114629 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" event={"ID":"0f547924-f70d-41f4-8461-57953f81d9ac","Type":"ContainerStarted","Data":"a2daaf584e7b2db8ff1350318bb9af9d6374f9efe7f6bad38fe778016fb92c43"}
Jan 30 21:25:33 crc kubenswrapper[4914]: I0130 21:25:33.116429 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" event={"ID":"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae","Type":"ContainerStarted","Data":"11183b6fefba10ace2f05cdcf9891d356bc540e080130d9c64a44c4cd6239ba2"}
Jan 30 21:25:33 crc kubenswrapper[4914]: I0130 21:25:33.118546 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" event={"ID":"1633e963-9082-4659-af26-20bc3b1e512b","Type":"ContainerStarted","Data":"618d4e2fc576a964e81cbbd2f8890f3c6d1723be328122e5e1d4007d782af18c"}
Jan 30 21:25:33 crc kubenswrapper[4914]: I0130 21:25:33.150526 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9752c" podStartSLOduration=26.264436937 podStartE2EDuration="30.15050683s" podCreationTimestamp="2026-01-30 21:25:03 +0000 UTC" firstStartedPulling="2026-01-30 21:25:28.067096353 +0000 UTC m=+661.505733134" lastFinishedPulling="2026-01-30 21:25:31.953166266 +0000 UTC m=+665.391803027" observedRunningTime="2026-01-30 21:25:33.146546013 +0000 UTC m=+666.585182794" watchObservedRunningTime="2026-01-30 21:25:33.15050683 +0000 UTC m=+666.589143601"
Jan 30 21:25:35 crc kubenswrapper[4914]: I0130 21:25:35.160274 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" event={"ID":"128f28df-6fdd-4a2c-86af-6dfe33baf2c9","Type":"ContainerStarted","Data":"73d7799e9ea82e7bae2c52f4157c0b725bf18f544849c6193f932541a7c3f23f"}
Jan 30 21:25:35 crc kubenswrapper[4914]: I0130 21:25:35.176486 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" event={"ID":"0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae","Type":"ContainerStarted","Data":"0d5b988ef9a1702198758f20095643b457a765a3c0b1ab9e70c0e728d674d0b3"}
Jan 30 21:25:35 crc kubenswrapper[4914]: I0130 21:25:35.252153 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz" podStartSLOduration=29.969612501 podStartE2EDuration="32.252134269s" podCreationTimestamp="2026-01-30 21:25:03 +0000 UTC" firstStartedPulling="2026-01-30 21:25:32.08150945 +0000 UTC m=+665.520146211" lastFinishedPulling="2026-01-30 21:25:34.364031218 +0000 UTC m=+667.802667979" observedRunningTime="2026-01-30 21:25:35.211588328 +0000 UTC m=+668.650225089" watchObservedRunningTime="2026-01-30 21:25:35.252134269 +0000 UTC m=+668.690771050"
Jan 30 21:25:35 crc kubenswrapper[4914]: I0130 21:25:35.258520 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb" podStartSLOduration=30.251310052 podStartE2EDuration="32.258501494s" podCreationTimestamp="2026-01-30 21:25:03 +0000 UTC" firstStartedPulling="2026-01-30 21:25:32.352149931 +0000 UTC m=+665.790786732" lastFinishedPulling="2026-01-30 21:25:34.359341413 +0000 UTC m=+667.797978174" observedRunningTime="2026-01-30 21:25:35.241155591 +0000 UTC m=+668.679792352" watchObservedRunningTime="2026-01-30 21:25:35.258501494 +0000 UTC m=+668.697138255"
Jan 30 21:25:38 crc kubenswrapper[4914]: I0130 21:25:38.196028 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" event={"ID":"0f547924-f70d-41f4-8461-57953f81d9ac","Type":"ContainerStarted","Data":"85dc767eb4e3d2179abfc05115325ed613d9a4868c403a3dc29d459a85bf80c8"}
Jan 30 21:25:38 crc kubenswrapper[4914]: I0130 21:25:38.196503 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm"
Jan 30 21:25:38 crc kubenswrapper[4914]: I0130 21:25:38.199781 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm"
Jan 30 21:25:38 crc kubenswrapper[4914]: I0130 21:25:38.226824 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-gf6nm" podStartSLOduration=29.863070759 podStartE2EDuration="35.226799011s" podCreationTimestamp="2026-01-30 21:25:03 +0000 UTC" firstStartedPulling="2026-01-30 21:25:32.353086674 +0000 UTC m=+665.791723435" lastFinishedPulling="2026-01-30 21:25:37.716814886 +0000 UTC m=+671.155451687" observedRunningTime="2026-01-30 21:25:38.217858753 +0000 UTC m=+671.656495554" watchObservedRunningTime="2026-01-30 21:25:38.226799011 +0000 UTC m=+671.665435812"
Jan 30 21:25:41 crc kubenswrapper[4914]: I0130 21:25:41.818516 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:41 crc kubenswrapper[4914]: I0130 21:25:41.819670 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:42 crc kubenswrapper[4914]: I0130 21:25:42.069602 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-6c2k7"]
Jan 30 21:25:42 crc kubenswrapper[4914]: W0130 21:25:42.079406 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b8e78e3_d709_4289_b9aa_15a6270a66d0.slice/crio-a19e81b91bf2e9d526a156ebaaf59bb95b9bce8969e1d0e23ad00a1710088d8a WatchSource:0}: Error finding container a19e81b91bf2e9d526a156ebaaf59bb95b9bce8969e1d0e23ad00a1710088d8a: Status 404 returned error can't find the container with id a19e81b91bf2e9d526a156ebaaf59bb95b9bce8969e1d0e23ad00a1710088d8a
Jan 30 21:25:42 crc kubenswrapper[4914]: I0130 21:25:42.232148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" event={"ID":"1b8e78e3-d709-4289-b9aa-15a6270a66d0","Type":"ContainerStarted","Data":"a19e81b91bf2e9d526a156ebaaf59bb95b9bce8969e1d0e23ad00a1710088d8a"}
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.576022 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"]
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.578293 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.582763 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-x54cc"]
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.583853 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x54cc"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.584302 4914 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-d66k6"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.586420 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.586421 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.586626 4914 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-68kkm"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.586795 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"]
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.596933 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fzp8x"]
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.597795 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.599265 4914 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-f9vf8"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.603596 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x54cc"]
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.612138 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fzp8x"]
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.691034 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75zr8\" (UniqueName: \"kubernetes.io/projected/5bc05e58-c6ae-4998-9ea7-1f60ec131e48-kube-api-access-75zr8\") pod \"cert-manager-858654f9db-x54cc\" (UID: \"5bc05e58-c6ae-4998-9ea7-1f60ec131e48\") " pod="cert-manager/cert-manager-858654f9db-x54cc"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.691076 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rkz\" (UniqueName: \"kubernetes.io/projected/a22759e4-6040-4a43-affd-a278ced19421-kube-api-access-r8rkz\") pod \"cert-manager-cainjector-cf98fcc89-s2t9k\" (UID: \"a22759e4-6040-4a43-affd-a278ced19421\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.691334 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd8xf\" (UniqueName: \"kubernetes.io/projected/814783bd-aa98-42ce-9cbe-8afdaa508449-kube-api-access-sd8xf\") pod \"cert-manager-webhook-687f57d79b-fzp8x\" (UID: \"814783bd-aa98-42ce-9cbe-8afdaa508449\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.792787 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75zr8\" (UniqueName: \"kubernetes.io/projected/5bc05e58-c6ae-4998-9ea7-1f60ec131e48-kube-api-access-75zr8\") pod \"cert-manager-858654f9db-x54cc\" (UID: \"5bc05e58-c6ae-4998-9ea7-1f60ec131e48\") " pod="cert-manager/cert-manager-858654f9db-x54cc"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.792832 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8rkz\" (UniqueName: \"kubernetes.io/projected/a22759e4-6040-4a43-affd-a278ced19421-kube-api-access-r8rkz\") pod \"cert-manager-cainjector-cf98fcc89-s2t9k\" (UID: \"a22759e4-6040-4a43-affd-a278ced19421\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.792852 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd8xf\" (UniqueName: \"kubernetes.io/projected/814783bd-aa98-42ce-9cbe-8afdaa508449-kube-api-access-sd8xf\") pod \"cert-manager-webhook-687f57d79b-fzp8x\" (UID: \"814783bd-aa98-42ce-9cbe-8afdaa508449\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.809151 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75zr8\" (UniqueName: \"kubernetes.io/projected/5bc05e58-c6ae-4998-9ea7-1f60ec131e48-kube-api-access-75zr8\") pod \"cert-manager-858654f9db-x54cc\" (UID: \"5bc05e58-c6ae-4998-9ea7-1f60ec131e48\") " pod="cert-manager/cert-manager-858654f9db-x54cc"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.809182 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd8xf\" (UniqueName: \"kubernetes.io/projected/814783bd-aa98-42ce-9cbe-8afdaa508449-kube-api-access-sd8xf\") pod \"cert-manager-webhook-687f57d79b-fzp8x\" (UID: \"814783bd-aa98-42ce-9cbe-8afdaa508449\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.814719 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8rkz\" (UniqueName: \"kubernetes.io/projected/a22759e4-6040-4a43-affd-a278ced19421-kube-api-access-r8rkz\") pod \"cert-manager-cainjector-cf98fcc89-s2t9k\" (UID: \"a22759e4-6040-4a43-affd-a278ced19421\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.902149 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.936345 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x54cc"
Jan 30 21:25:44 crc kubenswrapper[4914]: I0130 21:25:44.943692 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:25:45 crc kubenswrapper[4914]: I0130 21:25:45.260379 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" event={"ID":"1b8e78e3-d709-4289-b9aa-15a6270a66d0","Type":"ContainerStarted","Data":"d29ad64edb4d176b5a2b5c9d742d891efe57bd7d3e257a8649af59223a83a460"}
Jan 30 21:25:45 crc kubenswrapper[4914]: I0130 21:25:45.260808 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:45 crc kubenswrapper[4914]: I0130 21:25:45.292125 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7" podStartSLOduration=39.435846582 podStartE2EDuration="42.292105232s" podCreationTimestamp="2026-01-30 21:25:03 +0000 UTC" firstStartedPulling="2026-01-30 21:25:42.081683032 +0000 UTC m=+675.520319803" lastFinishedPulling="2026-01-30 21:25:44.937941682 +0000 UTC m=+678.376578453" observedRunningTime="2026-01-30 21:25:45.289892488 +0000 UTC m=+678.728529259" watchObservedRunningTime="2026-01-30 21:25:45.292105232 +0000 UTC m=+678.730741993"
Jan 30 21:25:45 crc kubenswrapper[4914]: I0130 21:25:45.361560 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k"]
Jan 30 21:25:45 crc kubenswrapper[4914]: W0130 21:25:45.364841 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda22759e4_6040_4a43_affd_a278ced19421.slice/crio-88da902b5fa657c7aa8bc2feab5a763114817640bb8f05527eb92bb53ec05ce8 WatchSource:0}: Error finding container 88da902b5fa657c7aa8bc2feab5a763114817640bb8f05527eb92bb53ec05ce8: Status 404 returned error can't find the container with id 88da902b5fa657c7aa8bc2feab5a763114817640bb8f05527eb92bb53ec05ce8
Jan 30 21:25:45 crc kubenswrapper[4914]: I0130 21:25:45.422397 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x54cc"]
Jan 30 21:25:45 crc kubenswrapper[4914]: I0130 21:25:45.430686 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fzp8x"]
Jan 30 21:25:45 crc kubenswrapper[4914]: W0130 21:25:45.431983 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc05e58_c6ae_4998_9ea7_1f60ec131e48.slice/crio-cb9e2f9fe91a2a4851eecf0ca5f5857afe692ac94405ba3e62e575bb21beca3a WatchSource:0}: Error finding container cb9e2f9fe91a2a4851eecf0ca5f5857afe692ac94405ba3e62e575bb21beca3a: Status 404 returned error can't find the container with id cb9e2f9fe91a2a4851eecf0ca5f5857afe692ac94405ba3e62e575bb21beca3a
Jan 30 21:25:45 crc kubenswrapper[4914]: W0130 21:25:45.436008 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814783bd_aa98_42ce_9cbe_8afdaa508449.slice/crio-8bf45f18fcddc412cca1f67a75fed1313d2825fa17c291a8db52300ae5019044 WatchSource:0}: Error finding container 8bf45f18fcddc412cca1f67a75fed1313d2825fa17c291a8db52300ae5019044: Status 404 returned error can't find the container with id 8bf45f18fcddc412cca1f67a75fed1313d2825fa17c291a8db52300ae5019044
Jan 30 21:25:46 crc kubenswrapper[4914]: I0130 21:25:46.271687 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x" event={"ID":"814783bd-aa98-42ce-9cbe-8afdaa508449","Type":"ContainerStarted","Data":"8bf45f18fcddc412cca1f67a75fed1313d2825fa17c291a8db52300ae5019044"}
Jan 30 21:25:46 crc kubenswrapper[4914]: I0130 21:25:46.273399 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x54cc" event={"ID":"5bc05e58-c6ae-4998-9ea7-1f60ec131e48","Type":"ContainerStarted","Data":"cb9e2f9fe91a2a4851eecf0ca5f5857afe692ac94405ba3e62e575bb21beca3a"}
Jan 30 21:25:46 crc kubenswrapper[4914]: I0130 21:25:46.275230 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k" event={"ID":"a22759e4-6040-4a43-affd-a278ced19421","Type":"ContainerStarted","Data":"88da902b5fa657c7aa8bc2feab5a763114817640bb8f05527eb92bb53ec05ce8"}
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.299967 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x" event={"ID":"814783bd-aa98-42ce-9cbe-8afdaa508449","Type":"ContainerStarted","Data":"a5807c28dc178032a75982c124fe52e9e27c526c2cdf5f3a1e9054352c5dd109"}
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.300598 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.302503 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x54cc" event={"ID":"5bc05e58-c6ae-4998-9ea7-1f60ec131e48","Type":"ContainerStarted","Data":"6d4687dac9ebd4343646833fb40ae79059c55c80d9381386c073dc0666a5fc33"}
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.305082 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k" event={"ID":"a22759e4-6040-4a43-affd-a278ced19421","Type":"ContainerStarted","Data":"12594931b50bdc5abbc3338422a7a81b03a0623db9097b374f113f88bfc12325"}
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.324450 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x" podStartSLOduration=2.421490156 podStartE2EDuration="6.324423131s" podCreationTimestamp="2026-01-30 21:25:44 +0000 UTC" firstStartedPulling="2026-01-30 21:25:45.437949544 +0000 UTC m=+678.876586305" lastFinishedPulling="2026-01-30 21:25:49.340882519 +0000 UTC m=+682.779519280" observedRunningTime="2026-01-30 21:25:50.318126377 +0000 UTC m=+683.756763148" watchObservedRunningTime="2026-01-30 21:25:50.324423131 +0000 UTC m=+683.763059902"
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.341690 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-s2t9k" podStartSLOduration=2.36800188 podStartE2EDuration="6.341663222s" podCreationTimestamp="2026-01-30 21:25:44 +0000 UTC" firstStartedPulling="2026-01-30 21:25:45.366858728 +0000 UTC m=+678.805495489" lastFinishedPulling="2026-01-30 21:25:49.34052006 +0000 UTC m=+682.779156831" observedRunningTime="2026-01-30 21:25:50.334812235 +0000 UTC m=+683.773449006" watchObservedRunningTime="2026-01-30 21:25:50.341663222 +0000 UTC m=+683.780300013"
Jan 30 21:25:50 crc kubenswrapper[4914]: I0130 21:25:50.393279 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-x54cc" podStartSLOduration=2.497235967 podStartE2EDuration="6.393257292s" podCreationTimestamp="2026-01-30 21:25:44 +0000 UTC" firstStartedPulling="2026-01-30 21:25:45.437926344 +0000 UTC m=+678.876563115" lastFinishedPulling="2026-01-30 21:25:49.333947679 +0000 UTC m=+682.772584440" observedRunningTime="2026-01-30 21:25:50.391804677 +0000 UTC m=+683.830441488" watchObservedRunningTime="2026-01-30 21:25:50.393257292 +0000 UTC m=+683.831894073"
Jan 30 21:25:54 crc kubenswrapper[4914]: I0130 21:25:54.119952 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-6c2k7"
Jan 30 21:25:54 crc kubenswrapper[4914]: I0130 21:25:54.947829 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-fzp8x"
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.108896 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"]
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.110672 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.112477 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.121175 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"]
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.166734 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.166833 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trd2w\" (UniqueName: \"kubernetes.io/projected/e36ddd94-dd2d-41d1-b22b-805928956f0d-kube-api-access-trd2w\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"
Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.166866 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"
Jan 30 21:26:18 crc kubenswrapper[4914]:
I0130 21:26:18.268434 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.268542 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trd2w\" (UniqueName: \"kubernetes.io/projected/e36ddd94-dd2d-41d1-b22b-805928956f0d-kube-api-access-trd2w\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.268574 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.269027 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.269129 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.293403 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trd2w\" (UniqueName: \"kubernetes.io/projected/e36ddd94-dd2d-41d1-b22b-805928956f0d-kube-api-access-trd2w\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.427591 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:18 crc kubenswrapper[4914]: I0130 21:26:18.866134 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk"] Jan 30 21:26:19 crc kubenswrapper[4914]: I0130 21:26:19.508513 4914 generic.go:334] "Generic (PLEG): container finished" podID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerID="dde48fdd009e9833ffcc2ca52b8ff6299a803d6b49278ad214891b78e79a2e60" exitCode=0 Jan 30 21:26:19 crc kubenswrapper[4914]: I0130 21:26:19.508593 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" event={"ID":"e36ddd94-dd2d-41d1-b22b-805928956f0d","Type":"ContainerDied","Data":"dde48fdd009e9833ffcc2ca52b8ff6299a803d6b49278ad214891b78e79a2e60"} Jan 30 21:26:19 crc kubenswrapper[4914]: I0130 21:26:19.508920 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" event={"ID":"e36ddd94-dd2d-41d1-b22b-805928956f0d","Type":"ContainerStarted","Data":"34208d604b731e253375ecf24053c4e49bdf7cb925f640b9d6d58f49f1fabc72"} Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.057922 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.059213 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.062563 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.062910 4914 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-w99r7" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.069340 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.072007 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.196034 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqg9n\" (UniqueName: \"kubernetes.io/projected/71c75e18-b4f6-4ff6-a0c7-2994a64ef14f-kube-api-access-zqg9n\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") " pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.196700 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-80a48671-d462-460e-8571-7bef70604439\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80a48671-d462-460e-8571-7bef70604439\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") " pod="minio-dev/minio" Jan 30 21:26:20 crc 
kubenswrapper[4914]: I0130 21:26:20.298346 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-80a48671-d462-460e-8571-7bef70604439\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80a48671-d462-460e-8571-7bef70604439\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") " pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.298457 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqg9n\" (UniqueName: \"kubernetes.io/projected/71c75e18-b4f6-4ff6-a0c7-2994a64ef14f-kube-api-access-zqg9n\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") " pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.322185 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.322237 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-80a48671-d462-460e-8571-7bef70604439\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80a48671-d462-460e-8571-7bef70604439\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/86866c445b5cf22e1d03304a7fce45a53da1f01df4dd7cd9ff0cfb2a53638c4a/globalmount\"" pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.335047 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqg9n\" (UniqueName: \"kubernetes.io/projected/71c75e18-b4f6-4ff6-a0c7-2994a64ef14f-kube-api-access-zqg9n\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") " pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.423377 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-80a48671-d462-460e-8571-7bef70604439\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-80a48671-d462-460e-8571-7bef70604439\") pod \"minio\" (UID: \"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f\") " pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.685993 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 30 21:26:20 crc kubenswrapper[4914]: I0130 21:26:20.951971 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 30 21:26:20 crc kubenswrapper[4914]: W0130 21:26:20.962805 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c75e18_b4f6_4ff6_a0c7_2994a64ef14f.slice/crio-9e07e6b3ebb39222cecc2218a5e73a5724d17a80c388b38b9bba142753fdf36c WatchSource:0}: Error finding container 9e07e6b3ebb39222cecc2218a5e73a5724d17a80c388b38b9bba142753fdf36c: Status 404 returned error can't find the container with id 9e07e6b3ebb39222cecc2218a5e73a5724d17a80c388b38b9bba142753fdf36c Jan 30 21:26:21 crc kubenswrapper[4914]: I0130 21:26:21.535140 4914 generic.go:334] "Generic (PLEG): container finished" podID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerID="05c504bcee87a4166e0133adaf0ad9348ac13f2c935128abbae0814cd130e322" exitCode=0 Jan 30 21:26:21 crc kubenswrapper[4914]: I0130 21:26:21.535215 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" event={"ID":"e36ddd94-dd2d-41d1-b22b-805928956f0d","Type":"ContainerDied","Data":"05c504bcee87a4166e0133adaf0ad9348ac13f2c935128abbae0814cd130e322"} Jan 30 21:26:21 crc kubenswrapper[4914]: I0130 21:26:21.538288 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f","Type":"ContainerStarted","Data":"9e07e6b3ebb39222cecc2218a5e73a5724d17a80c388b38b9bba142753fdf36c"} Jan 30 
21:26:22 crc kubenswrapper[4914]: I0130 21:26:22.546517 4914 generic.go:334] "Generic (PLEG): container finished" podID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerID="8859e9f6be157f94e1ecfc19f25fba3385983cb0b085554b4ba2790d23c06cb2" exitCode=0 Jan 30 21:26:22 crc kubenswrapper[4914]: I0130 21:26:22.546601 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" event={"ID":"e36ddd94-dd2d-41d1-b22b-805928956f0d","Type":"ContainerDied","Data":"8859e9f6be157f94e1ecfc19f25fba3385983cb0b085554b4ba2790d23c06cb2"} Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.154457 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.278240 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-bundle\") pod \"e36ddd94-dd2d-41d1-b22b-805928956f0d\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.278321 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-util\") pod \"e36ddd94-dd2d-41d1-b22b-805928956f0d\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.278346 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trd2w\" (UniqueName: \"kubernetes.io/projected/e36ddd94-dd2d-41d1-b22b-805928956f0d-kube-api-access-trd2w\") pod \"e36ddd94-dd2d-41d1-b22b-805928956f0d\" (UID: \"e36ddd94-dd2d-41d1-b22b-805928956f0d\") " Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.280070 4914 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-bundle" (OuterVolumeSpecName: "bundle") pod "e36ddd94-dd2d-41d1-b22b-805928956f0d" (UID: "e36ddd94-dd2d-41d1-b22b-805928956f0d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.284051 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36ddd94-dd2d-41d1-b22b-805928956f0d-kube-api-access-trd2w" (OuterVolumeSpecName: "kube-api-access-trd2w") pod "e36ddd94-dd2d-41d1-b22b-805928956f0d" (UID: "e36ddd94-dd2d-41d1-b22b-805928956f0d"). InnerVolumeSpecName "kube-api-access-trd2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.380433 4914 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.380466 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trd2w\" (UniqueName: \"kubernetes.io/projected/e36ddd94-dd2d-41d1-b22b-805928956f0d-kube-api-access-trd2w\") on node \"crc\" DevicePath \"\"" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.580832 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" event={"ID":"e36ddd94-dd2d-41d1-b22b-805928956f0d","Type":"ContainerDied","Data":"34208d604b731e253375ecf24053c4e49bdf7cb925f640b9d6d58f49f1fabc72"} Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.580870 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34208d604b731e253375ecf24053c4e49bdf7cb925f640b9d6d58f49f1fabc72" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.580942 4914 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.608945 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-util" (OuterVolumeSpecName: "util") pod "e36ddd94-dd2d-41d1-b22b-805928956f0d" (UID: "e36ddd94-dd2d-41d1-b22b-805928956f0d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:26:24 crc kubenswrapper[4914]: I0130 21:26:24.685333 4914 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e36ddd94-dd2d-41d1-b22b-805928956f0d-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:26:25 crc kubenswrapper[4914]: I0130 21:26:25.589808 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"71c75e18-b4f6-4ff6-a0c7-2994a64ef14f","Type":"ContainerStarted","Data":"71c5368b2cd7fb5eacec5d7e55e0069c8223adc2d17838815ce32e282d304310"} Jan 30 21:26:25 crc kubenswrapper[4914]: I0130 21:26:25.630290 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.094589363 podStartE2EDuration="8.630263104s" podCreationTimestamp="2026-01-30 21:26:17 +0000 UTC" firstStartedPulling="2026-01-30 21:26:20.965250963 +0000 UTC m=+714.403887734" lastFinishedPulling="2026-01-30 21:26:24.500924704 +0000 UTC m=+717.939561475" observedRunningTime="2026-01-30 21:26:25.61794492 +0000 UTC m=+719.056581711" watchObservedRunningTime="2026-01-30 21:26:25.630263104 +0000 UTC m=+719.068899895" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.074058 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5"] Jan 30 21:26:31 crc kubenswrapper[4914]: E0130 21:26:31.074896 4914 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="util" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.074912 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="util" Jan 30 21:26:31 crc kubenswrapper[4914]: E0130 21:26:31.074928 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="extract" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.074935 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="extract" Jan 30 21:26:31 crc kubenswrapper[4914]: E0130 21:26:31.074945 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="pull" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.074952 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="pull" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.075084 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36ddd94-dd2d-41d1-b22b-805928956f0d" containerName="extract" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.075765 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.077507 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-jttj2" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.077510 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.077685 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.077896 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.078039 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.078400 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.102422 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5"] Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.243639 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-webhook-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.243697 
4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/863c64b0-0be9-464d-973a-2bbfc89a6ff0-manager-config\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.243844 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4wc\" (UniqueName: \"kubernetes.io/projected/863c64b0-0be9-464d-973a-2bbfc89a6ff0-kube-api-access-bx4wc\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.243903 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.243955 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-apiservice-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.344527 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bx4wc\" (UniqueName: \"kubernetes.io/projected/863c64b0-0be9-464d-973a-2bbfc89a6ff0-kube-api-access-bx4wc\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.344567 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.344591 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-apiservice-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.344632 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-webhook-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.344654 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/863c64b0-0be9-464d-973a-2bbfc89a6ff0-manager-config\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: 
\"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.345454 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/863c64b0-0be9-464d-973a-2bbfc89a6ff0-manager-config\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.349915 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-webhook-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.353182 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-apiservice-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.354364 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/863c64b0-0be9-464d-973a-2bbfc89a6ff0-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.389587 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bx4wc\" (UniqueName: \"kubernetes.io/projected/863c64b0-0be9-464d-973a-2bbfc89a6ff0-kube-api-access-bx4wc\") pod \"loki-operator-controller-manager-6cc9c48657-sbpc5\" (UID: \"863c64b0-0be9-464d-973a-2bbfc89a6ff0\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.688503 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:31 crc kubenswrapper[4914]: I0130 21:26:31.901842 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5"] Jan 30 21:26:32 crc kubenswrapper[4914]: I0130 21:26:32.634778 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" event={"ID":"863c64b0-0be9-464d-973a-2bbfc89a6ff0","Type":"ContainerStarted","Data":"9c2311fbe39cc16973a41fb424b7c31596cca0f24cf6943a18a026ac21894543"} Jan 30 21:26:36 crc kubenswrapper[4914]: I0130 21:26:36.668549 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" event={"ID":"863c64b0-0be9-464d-973a-2bbfc89a6ff0","Type":"ContainerStarted","Data":"3940d274d3240dff36d15a1eccc8b639aec2f15dfe0553ba40049454ec3876cc"} Jan 30 21:26:43 crc kubenswrapper[4914]: I0130 21:26:43.712850 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" event={"ID":"863c64b0-0be9-464d-973a-2bbfc89a6ff0","Type":"ContainerStarted","Data":"59ac8555e5c38a6b11a28b7f27d4c920c2656d91829cb0323c3d5c9b8a2dc4db"} Jan 30 21:26:43 crc kubenswrapper[4914]: I0130 21:26:43.713471 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:43 crc kubenswrapper[4914]: I0130 21:26:43.719465 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" Jan 30 21:26:43 crc kubenswrapper[4914]: I0130 21:26:43.774620 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6cc9c48657-sbpc5" podStartSLOduration=1.593600855 podStartE2EDuration="12.774604561s" podCreationTimestamp="2026-01-30 21:26:31 +0000 UTC" firstStartedPulling="2026-01-30 21:26:31.908805782 +0000 UTC m=+725.347442543" lastFinishedPulling="2026-01-30 21:26:43.089809478 +0000 UTC m=+736.528446249" observedRunningTime="2026-01-30 21:26:43.773768911 +0000 UTC m=+737.212405712" watchObservedRunningTime="2026-01-30 21:26:43.774604561 +0000 UTC m=+737.213241332" Jan 30 21:27:07 crc kubenswrapper[4914]: I0130 21:27:07.733432 4914 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.528051 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g"] Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.529382 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.531905 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.537808 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g"] Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.611876 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jckz8\" (UniqueName: \"kubernetes.io/projected/aab7512f-d12c-4b8c-b07b-45e27163ac4d-kube-api-access-jckz8\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.612060 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.612135 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: 
I0130 21:27:17.713145 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jckz8\" (UniqueName: \"kubernetes.io/projected/aab7512f-d12c-4b8c-b07b-45e27163ac4d-kube-api-access-jckz8\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.713211 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.713233 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.713677 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.713696 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.741956 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jckz8\" (UniqueName: \"kubernetes.io/projected/aab7512f-d12c-4b8c-b07b-45e27163ac4d-kube-api-access-jckz8\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:17 crc kubenswrapper[4914]: I0130 21:27:17.851608 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:18 crc kubenswrapper[4914]: I0130 21:27:18.127792 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g"] Jan 30 21:27:18 crc kubenswrapper[4914]: I0130 21:27:18.961906 4914 generic.go:334] "Generic (PLEG): container finished" podID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerID="4f5dc08bf519dca53a5d6818ca70377d13544dbc14f6f65e058f588d28b88d76" exitCode=0 Jan 30 21:27:18 crc kubenswrapper[4914]: I0130 21:27:18.961974 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" event={"ID":"aab7512f-d12c-4b8c-b07b-45e27163ac4d","Type":"ContainerDied","Data":"4f5dc08bf519dca53a5d6818ca70377d13544dbc14f6f65e058f588d28b88d76"} Jan 30 21:27:18 crc kubenswrapper[4914]: I0130 21:27:18.962249 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" event={"ID":"aab7512f-d12c-4b8c-b07b-45e27163ac4d","Type":"ContainerStarted","Data":"6dc188381b8e85d7ca32349e3f796a90213c147540094494976b9c9868124202"} Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.416393 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sz6lb"] Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.417838 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.438545 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sz6lb"] Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.556391 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7slx\" (UniqueName: \"kubernetes.io/projected/44eb32b5-f694-4d61-a5ec-57be19cddd47-kube-api-access-t7slx\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.556485 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-catalog-content\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.556521 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-utilities\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " 
pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.658261 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-catalog-content\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.658309 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-utilities\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.658358 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7slx\" (UniqueName: \"kubernetes.io/projected/44eb32b5-f694-4d61-a5ec-57be19cddd47-kube-api-access-t7slx\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.658787 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-catalog-content\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.658915 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-utilities\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc 
kubenswrapper[4914]: I0130 21:27:19.679606 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7slx\" (UniqueName: \"kubernetes.io/projected/44eb32b5-f694-4d61-a5ec-57be19cddd47-kube-api-access-t7slx\") pod \"redhat-operators-sz6lb\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:19 crc kubenswrapper[4914]: I0130 21:27:19.732119 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:20 crc kubenswrapper[4914]: I0130 21:27:20.137991 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sz6lb"] Jan 30 21:27:20 crc kubenswrapper[4914]: I0130 21:27:20.973780 4914 generic.go:334] "Generic (PLEG): container finished" podID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerID="8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78" exitCode=0 Jan 30 21:27:20 crc kubenswrapper[4914]: I0130 21:27:20.973869 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerDied","Data":"8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78"} Jan 30 21:27:20 crc kubenswrapper[4914]: I0130 21:27:20.974071 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerStarted","Data":"46db89869026ed5ba324027383fc2c745eba25de9152474ed4157bb4c0b26b94"} Jan 30 21:27:20 crc kubenswrapper[4914]: I0130 21:27:20.976117 4914 generic.go:334] "Generic (PLEG): container finished" podID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerID="33526c0937e3c70b1c80dd6072e15d1422f26a004659d0ce1870732859a2bb90" exitCode=0 Jan 30 21:27:20 crc kubenswrapper[4914]: I0130 21:27:20.976161 4914 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" event={"ID":"aab7512f-d12c-4b8c-b07b-45e27163ac4d","Type":"ContainerDied","Data":"33526c0937e3c70b1c80dd6072e15d1422f26a004659d0ce1870732859a2bb90"} Jan 30 21:27:21 crc kubenswrapper[4914]: I0130 21:27:21.986159 4914 generic.go:334] "Generic (PLEG): container finished" podID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerID="c2729a37e1bb7f58f4ce63e22df2ce144bf8d08796ecbc531c22e47989ff046c" exitCode=0 Jan 30 21:27:21 crc kubenswrapper[4914]: I0130 21:27:21.986249 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" event={"ID":"aab7512f-d12c-4b8c-b07b-45e27163ac4d","Type":"ContainerDied","Data":"c2729a37e1bb7f58f4ce63e22df2ce144bf8d08796ecbc531c22e47989ff046c"} Jan 30 21:27:21 crc kubenswrapper[4914]: I0130 21:27:21.988742 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerStarted","Data":"47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002"} Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.000147 4914 generic.go:334] "Generic (PLEG): container finished" podID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerID="47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002" exitCode=0 Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.000209 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerDied","Data":"47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002"} Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.324530 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.327125 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jckz8\" (UniqueName: \"kubernetes.io/projected/aab7512f-d12c-4b8c-b07b-45e27163ac4d-kube-api-access-jckz8\") pod \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.327193 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-util\") pod \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.327249 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-bundle\") pod \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\" (UID: \"aab7512f-d12c-4b8c-b07b-45e27163ac4d\") " Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.328415 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-bundle" (OuterVolumeSpecName: "bundle") pod "aab7512f-d12c-4b8c-b07b-45e27163ac4d" (UID: "aab7512f-d12c-4b8c-b07b-45e27163ac4d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.332669 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab7512f-d12c-4b8c-b07b-45e27163ac4d-kube-api-access-jckz8" (OuterVolumeSpecName: "kube-api-access-jckz8") pod "aab7512f-d12c-4b8c-b07b-45e27163ac4d" (UID: "aab7512f-d12c-4b8c-b07b-45e27163ac4d"). InnerVolumeSpecName "kube-api-access-jckz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.344037 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-util" (OuterVolumeSpecName: "util") pod "aab7512f-d12c-4b8c-b07b-45e27163ac4d" (UID: "aab7512f-d12c-4b8c-b07b-45e27163ac4d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.429195 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jckz8\" (UniqueName: \"kubernetes.io/projected/aab7512f-d12c-4b8c-b07b-45e27163ac4d-kube-api-access-jckz8\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.429432 4914 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:23 crc kubenswrapper[4914]: I0130 21:27:23.429443 4914 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab7512f-d12c-4b8c-b07b-45e27163ac4d-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:24 crc kubenswrapper[4914]: I0130 21:27:24.007003 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" Jan 30 21:27:24 crc kubenswrapper[4914]: I0130 21:27:24.007031 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g" event={"ID":"aab7512f-d12c-4b8c-b07b-45e27163ac4d","Type":"ContainerDied","Data":"6dc188381b8e85d7ca32349e3f796a90213c147540094494976b9c9868124202"} Jan 30 21:27:24 crc kubenswrapper[4914]: I0130 21:27:24.007070 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc188381b8e85d7ca32349e3f796a90213c147540094494976b9c9868124202" Jan 30 21:27:24 crc kubenswrapper[4914]: I0130 21:27:24.009600 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerStarted","Data":"4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666"} Jan 30 21:27:24 crc kubenswrapper[4914]: I0130 21:27:24.030244 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sz6lb" podStartSLOduration=2.608183999 podStartE2EDuration="5.030228882s" podCreationTimestamp="2026-01-30 21:27:19 +0000 UTC" firstStartedPulling="2026-01-30 21:27:20.975092558 +0000 UTC m=+774.413729319" lastFinishedPulling="2026-01-30 21:27:23.397137441 +0000 UTC m=+776.835774202" observedRunningTime="2026-01-30 21:27:24.026286708 +0000 UTC m=+777.464923489" watchObservedRunningTime="2026-01-30 21:27:24.030228882 +0000 UTC m=+777.468865643" Jan 30 21:27:26 crc kubenswrapper[4914]: I0130 21:27:26.983218 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Jan 30 21:27:26 crc kubenswrapper[4914]: I0130 21:27:26.983622 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.686379 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-9rpjx"] Jan 30 21:27:28 crc kubenswrapper[4914]: E0130 21:27:28.686923 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="pull" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.686938 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="pull" Jan 30 21:27:28 crc kubenswrapper[4914]: E0130 21:27:28.686955 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="extract" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.686963 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="extract" Jan 30 21:27:28 crc kubenswrapper[4914]: E0130 21:27:28.686973 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="util" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.686981 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="util" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.687091 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab7512f-d12c-4b8c-b07b-45e27163ac4d" containerName="extract" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.687511 4914 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.689610 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.689922 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.692867 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-54rwv" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.706757 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-9rpjx"] Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.798358 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjd4j\" (UniqueName: \"kubernetes.io/projected/48287b34-9e24-45ef-b31a-d8e32f405068-kube-api-access-mjd4j\") pod \"nmstate-operator-646758c888-9rpjx\" (UID: \"48287b34-9e24-45ef-b31a-d8e32f405068\") " pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.900023 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjd4j\" (UniqueName: \"kubernetes.io/projected/48287b34-9e24-45ef-b31a-d8e32f405068-kube-api-access-mjd4j\") pod \"nmstate-operator-646758c888-9rpjx\" (UID: \"48287b34-9e24-45ef-b31a-d8e32f405068\") " pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" Jan 30 21:27:28 crc kubenswrapper[4914]: I0130 21:27:28.921311 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjd4j\" (UniqueName: \"kubernetes.io/projected/48287b34-9e24-45ef-b31a-d8e32f405068-kube-api-access-mjd4j\") pod \"nmstate-operator-646758c888-9rpjx\" (UID: 
\"48287b34-9e24-45ef-b31a-d8e32f405068\") " pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" Jan 30 21:27:29 crc kubenswrapper[4914]: I0130 21:27:29.045137 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" Jan 30 21:27:29 crc kubenswrapper[4914]: I0130 21:27:29.249311 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-9rpjx"] Jan 30 21:27:29 crc kubenswrapper[4914]: W0130 21:27:29.285765 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48287b34_9e24_45ef_b31a_d8e32f405068.slice/crio-e93b2831cddc8587a278773f1f205b7c10ce89e05f7e280b0c8f30816f7019f4 WatchSource:0}: Error finding container e93b2831cddc8587a278773f1f205b7c10ce89e05f7e280b0c8f30816f7019f4: Status 404 returned error can't find the container with id e93b2831cddc8587a278773f1f205b7c10ce89e05f7e280b0c8f30816f7019f4 Jan 30 21:27:29 crc kubenswrapper[4914]: I0130 21:27:29.732521 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:29 crc kubenswrapper[4914]: I0130 21:27:29.732568 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:30 crc kubenswrapper[4914]: I0130 21:27:30.041227 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" event={"ID":"48287b34-9e24-45ef-b31a-d8e32f405068","Type":"ContainerStarted","Data":"e93b2831cddc8587a278773f1f205b7c10ce89e05f7e280b0c8f30816f7019f4"} Jan 30 21:27:30 crc kubenswrapper[4914]: I0130 21:27:30.769948 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6lb" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="registry-server" probeResult="failure" output=< Jan 
30 21:27:30 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:27:30 crc kubenswrapper[4914]: > Jan 30 21:27:33 crc kubenswrapper[4914]: I0130 21:27:33.063897 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" event={"ID":"48287b34-9e24-45ef-b31a-d8e32f405068","Type":"ContainerStarted","Data":"e86553469ec64db15b04581f170e9ec6a3a2a56757dca510505d8665a2e8dde9"} Jan 30 21:27:33 crc kubenswrapper[4914]: I0130 21:27:33.092953 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-9rpjx" podStartSLOduration=1.6003587910000001 podStartE2EDuration="5.092929602s" podCreationTimestamp="2026-01-30 21:27:28 +0000 UTC" firstStartedPulling="2026-01-30 21:27:29.288024509 +0000 UTC m=+782.726661270" lastFinishedPulling="2026-01-30 21:27:32.78059531 +0000 UTC m=+786.219232081" observedRunningTime="2026-01-30 21:27:33.090925467 +0000 UTC m=+786.529562258" watchObservedRunningTime="2026-01-30 21:27:33.092929602 +0000 UTC m=+786.531566403" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.068139 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-vrwjg"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.070341 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.077718 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qxgwc" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.085114 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.087370 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.118796 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.125362 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-vrwjg"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.133570 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.150762 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-njdw4"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.151505 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.221367 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.238782 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241593 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-nmstate-lock\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241633 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/167400a6-ae93-41a2-a825-ba7bd5984a12-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lfjgh\" (UID: \"167400a6-ae93-41a2-a825-ba7bd5984a12\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241660 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-ovs-socket\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241666 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241697 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdqln\" (UniqueName: \"kubernetes.io/projected/167400a6-ae93-41a2-a825-ba7bd5984a12-kube-api-access-qdqln\") pod \"nmstate-webhook-8474b5b9d8-lfjgh\" (UID: \"167400a6-ae93-41a2-a825-ba7bd5984a12\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241754 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qmkm\" (UniqueName: \"kubernetes.io/projected/a054e0a7-fe48-4adc-b216-d386c6ecd958-kube-api-access-7qmkm\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241773 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241787 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdvv\" (UniqueName: \"kubernetes.io/projected/5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d-kube-api-access-7kdvv\") pod \"nmstate-metrics-54757c584b-vrwjg\" (UID: \"5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.241802 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-dbus-socket\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.242311 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.245809 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ztr87" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342447 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-dbus-socket\") pod \"nmstate-handler-njdw4\" (UID: 
\"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342500 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce26ef88-3c09-4fe1-bb28-56fceef865fb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342523 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-nmstate-lock\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342544 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/167400a6-ae93-41a2-a825-ba7bd5984a12-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lfjgh\" (UID: \"167400a6-ae93-41a2-a825-ba7bd5984a12\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342567 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-ovs-socket\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342609 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-ovs-socket\") pod \"nmstate-handler-njdw4\" (UID: 
\"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342681 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ce26ef88-3c09-4fe1-bb28-56fceef865fb-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342681 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-nmstate-lock\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342726 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdqln\" (UniqueName: \"kubernetes.io/projected/167400a6-ae93-41a2-a825-ba7bd5984a12-kube-api-access-qdqln\") pod \"nmstate-webhook-8474b5b9d8-lfjgh\" (UID: \"167400a6-ae93-41a2-a825-ba7bd5984a12\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342796 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qmkm\" (UniqueName: \"kubernetes.io/projected/a054e0a7-fe48-4adc-b216-d386c6ecd958-kube-api-access-7qmkm\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342817 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a054e0a7-fe48-4adc-b216-d386c6ecd958-dbus-socket\") pod \"nmstate-handler-njdw4\" (UID: 
\"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342925 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvwk\" (UniqueName: \"kubernetes.io/projected/ce26ef88-3c09-4fe1-bb28-56fceef865fb-kube-api-access-hxvwk\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.342976 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdvv\" (UniqueName: \"kubernetes.io/projected/5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d-kube-api-access-7kdvv\") pod \"nmstate-metrics-54757c584b-vrwjg\" (UID: \"5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.351631 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/167400a6-ae93-41a2-a825-ba7bd5984a12-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-lfjgh\" (UID: \"167400a6-ae93-41a2-a825-ba7bd5984a12\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.363406 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdqln\" (UniqueName: \"kubernetes.io/projected/167400a6-ae93-41a2-a825-ba7bd5984a12-kube-api-access-qdqln\") pod \"nmstate-webhook-8474b5b9d8-lfjgh\" (UID: \"167400a6-ae93-41a2-a825-ba7bd5984a12\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.367631 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdvv\" (UniqueName: 
\"kubernetes.io/projected/5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d-kube-api-access-7kdvv\") pod \"nmstate-metrics-54757c584b-vrwjg\" (UID: \"5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.384556 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qmkm\" (UniqueName: \"kubernetes.io/projected/a054e0a7-fe48-4adc-b216-d386c6ecd958-kube-api-access-7qmkm\") pod \"nmstate-handler-njdw4\" (UID: \"a054e0a7-fe48-4adc-b216-d386c6ecd958\") " pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.389574 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.432106 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dc57c7dfc-c7mfm"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.432925 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.433514 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.448146 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ce26ef88-3c09-4fe1-bb28-56fceef865fb-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.448212 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvwk\" (UniqueName: \"kubernetes.io/projected/ce26ef88-3c09-4fe1-bb28-56fceef865fb-kube-api-access-hxvwk\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.448246 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce26ef88-3c09-4fe1-bb28-56fceef865fb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.449250 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dc57c7dfc-c7mfm"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.450574 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ce26ef88-3c09-4fe1-bb28-56fceef865fb-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.456559 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce26ef88-3c09-4fe1-bb28-56fceef865fb-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.469592 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvwk\" (UniqueName: \"kubernetes.io/projected/ce26ef88-3c09-4fe1-bb28-56fceef865fb-kube-api-access-hxvwk\") pod \"nmstate-console-plugin-7754f76f8b-29k7h\" (UID: \"ce26ef88-3c09-4fe1-bb28-56fceef865fb\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.470252 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-njdw4" Jan 30 21:27:39 crc kubenswrapper[4914]: W0130 21:27:39.491875 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda054e0a7_fe48_4adc_b216_d386c6ecd958.slice/crio-0b25ac2ad674e03a21a55745ef0362d76b31e8b0730e0c2e2ccf0d2fc08010e8 WatchSource:0}: Error finding container 0b25ac2ad674e03a21a55745ef0362d76b31e8b0730e0c2e2ccf0d2fc08010e8: Status 404 returned error can't find the container with id 0b25ac2ad674e03a21a55745ef0362d76b31e8b0730e0c2e2ccf0d2fc08010e8 Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.549977 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-trusted-ca-bundle\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.550476 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-serving-cert\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.550507 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75g4h\" (UniqueName: \"kubernetes.io/projected/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-kube-api-access-75g4h\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.550530 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-oauth-config\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.550558 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-oauth-serving-cert\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.550627 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-service-ca\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " 
pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.550643 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-config\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.564306 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651609 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-service-ca\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651642 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-config\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651672 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-trusted-ca-bundle\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651692 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-serving-cert\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651737 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75g4h\" (UniqueName: \"kubernetes.io/projected/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-kube-api-access-75g4h\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651761 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-oauth-config\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.651788 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-oauth-serving-cert\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.652658 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-oauth-serving-cert\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.652727 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-service-ca\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.653347 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-config\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.654525 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-trusted-ca-bundle\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.664912 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-oauth-config\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.664959 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-console-serving-cert\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.671367 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75g4h\" (UniqueName: 
\"kubernetes.io/projected/3f9faf2e-2543-4f00-bf1e-a82c8d9f744c-kube-api-access-75g4h\") pod \"console-5dc57c7dfc-c7mfm\" (UID: \"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c\") " pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.766209 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc57c7dfc-c7mfm" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.781423 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.835359 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.844525 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-vrwjg"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.870724 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh"] Jan 30 21:27:39 crc kubenswrapper[4914]: W0130 21:27:39.875999 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod167400a6_ae93_41a2_a825_ba7bd5984a12.slice/crio-72e4456d652050b2647ae56f6173aa4640630d72c354b60866648a9cff0529f8 WatchSource:0}: Error finding container 72e4456d652050b2647ae56f6173aa4640630d72c354b60866648a9cff0529f8: Status 404 returned error can't find the container with id 72e4456d652050b2647ae56f6173aa4640630d72c354b60866648a9cff0529f8 Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.956772 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h"] Jan 30 21:27:39 crc kubenswrapper[4914]: I0130 21:27:39.968051 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-5dc57c7dfc-c7mfm"] Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.012343 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sz6lb"] Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.129632 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-njdw4" event={"ID":"a054e0a7-fe48-4adc-b216-d386c6ecd958","Type":"ContainerStarted","Data":"0b25ac2ad674e03a21a55745ef0362d76b31e8b0730e0c2e2ccf0d2fc08010e8"} Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.130613 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" event={"ID":"ce26ef88-3c09-4fe1-bb28-56fceef865fb","Type":"ContainerStarted","Data":"f38b697091b19bf431760bfa2258fcf28df7bd8a2ad91865eb4763229f4d1038"} Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.131797 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" event={"ID":"5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d","Type":"ContainerStarted","Data":"35a778023e7bff194acde562a0acfbbc0d88d1524be1b2539fa37210913ed4d1"} Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.132645 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" event={"ID":"167400a6-ae93-41a2-a825-ba7bd5984a12","Type":"ContainerStarted","Data":"72e4456d652050b2647ae56f6173aa4640630d72c354b60866648a9cff0529f8"} Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.134298 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc57c7dfc-c7mfm" event={"ID":"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c","Type":"ContainerStarted","Data":"b8f7c38a0a2c2c5e7994ac82b4d5fc3a8fb33ce31670cf1c3e1ac97b7253a68c"} Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.134341 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5dc57c7dfc-c7mfm" event={"ID":"3f9faf2e-2543-4f00-bf1e-a82c8d9f744c","Type":"ContainerStarted","Data":"a27065963bc0ce32d51c2b77c88b807a186fc3ce268db29fa1bb44f28a1823de"} Jan 30 21:27:40 crc kubenswrapper[4914]: I0130 21:27:40.157002 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dc57c7dfc-c7mfm" podStartSLOduration=1.156973877 podStartE2EDuration="1.156973877s" podCreationTimestamp="2026-01-30 21:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:27:40.149506328 +0000 UTC m=+793.588143099" watchObservedRunningTime="2026-01-30 21:27:40.156973877 +0000 UTC m=+793.595610648" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.141338 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sz6lb" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="registry-server" containerID="cri-o://4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666" gracePeriod=2 Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.542627 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.687461 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-utilities\") pod \"44eb32b5-f694-4d61-a5ec-57be19cddd47\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.687602 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-catalog-content\") pod \"44eb32b5-f694-4d61-a5ec-57be19cddd47\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.687698 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7slx\" (UniqueName: \"kubernetes.io/projected/44eb32b5-f694-4d61-a5ec-57be19cddd47-kube-api-access-t7slx\") pod \"44eb32b5-f694-4d61-a5ec-57be19cddd47\" (UID: \"44eb32b5-f694-4d61-a5ec-57be19cddd47\") " Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.688554 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-utilities" (OuterVolumeSpecName: "utilities") pod "44eb32b5-f694-4d61-a5ec-57be19cddd47" (UID: "44eb32b5-f694-4d61-a5ec-57be19cddd47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.696039 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44eb32b5-f694-4d61-a5ec-57be19cddd47-kube-api-access-t7slx" (OuterVolumeSpecName: "kube-api-access-t7slx") pod "44eb32b5-f694-4d61-a5ec-57be19cddd47" (UID: "44eb32b5-f694-4d61-a5ec-57be19cddd47"). InnerVolumeSpecName "kube-api-access-t7slx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.789506 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7slx\" (UniqueName: \"kubernetes.io/projected/44eb32b5-f694-4d61-a5ec-57be19cddd47-kube-api-access-t7slx\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.789540 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.809220 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44eb32b5-f694-4d61-a5ec-57be19cddd47" (UID: "44eb32b5-f694-4d61-a5ec-57be19cddd47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4914]: I0130 21:27:41.891563 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44eb32b5-f694-4d61-a5ec-57be19cddd47-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.152271 4914 generic.go:334] "Generic (PLEG): container finished" podID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerID="4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666" exitCode=0 Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.152503 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerDied","Data":"4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666"} Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.152683 4914 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-sz6lb" event={"ID":"44eb32b5-f694-4d61-a5ec-57be19cddd47","Type":"ContainerDied","Data":"46db89869026ed5ba324027383fc2c745eba25de9152474ed4157bb4c0b26b94"} Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.152739 4914 scope.go:117] "RemoveContainer" containerID="4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666" Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.152629 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6lb" Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.178273 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sz6lb"] Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.183449 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sz6lb"] Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.901567 4914 scope.go:117] "RemoveContainer" containerID="47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002" Jan 30 21:27:42 crc kubenswrapper[4914]: I0130 21:27:42.941101 4914 scope.go:117] "RemoveContainer" containerID="8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78" Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.096276 4914 scope.go:117] "RemoveContainer" containerID="4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666" Jan 30 21:27:43 crc kubenswrapper[4914]: E0130 21:27:43.096727 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666\": container with ID starting with 4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666 not found: ID does not exist" containerID="4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666" Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.096783 4914 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666"} err="failed to get container status \"4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666\": rpc error: code = NotFound desc = could not find container \"4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666\": container with ID starting with 4b4c3574219643540650f4127f9ec437ddb50c680628beeb2ff22c91fb4f0666 not found: ID does not exist"
Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.096814 4914 scope.go:117] "RemoveContainer" containerID="47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002"
Jan 30 21:27:43 crc kubenswrapper[4914]: E0130 21:27:43.097371 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002\": container with ID starting with 47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002 not found: ID does not exist" containerID="47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002"
Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.097418 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002"} err="failed to get container status \"47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002\": rpc error: code = NotFound desc = could not find container \"47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002\": container with ID starting with 47026f1bec1f5e58be78ee19d9f4e0fd56c8d6dd9af7e11dd1009c650a425002 not found: ID does not exist"
Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.097453 4914 scope.go:117] "RemoveContainer" containerID="8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78"
Jan 30 21:27:43 crc kubenswrapper[4914]: E0130 21:27:43.097810 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78\": container with ID starting with 8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78 not found: ID does not exist" containerID="8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78"
Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.097836 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78"} err="failed to get container status \"8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78\": rpc error: code = NotFound desc = could not find container \"8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78\": container with ID starting with 8ef1409dcbb3ca0727aa568c4a2373df3b1821334a652d7c3792effc10fe0e78 not found: ID does not exist"
Jan 30 21:27:43 crc kubenswrapper[4914]: I0130 21:27:43.832349 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" path="/var/lib/kubelet/pods/44eb32b5-f694-4d61-a5ec-57be19cddd47/volumes"
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.174246 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" event={"ID":"ce26ef88-3c09-4fe1-bb28-56fceef865fb","Type":"ContainerStarted","Data":"8d4383ebe394c1901d155d49d5dcf0368ed9a4b652eee19d8eaac10be01e3cfb"}
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.176166 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-njdw4" event={"ID":"a054e0a7-fe48-4adc-b216-d386c6ecd958","Type":"ContainerStarted","Data":"f290dedc936c1586efe5632c75fdce7d3c55696d5b4742ee1e8c0b1f52885fde"}
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.178370 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" event={"ID":"5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d","Type":"ContainerStarted","Data":"9fe545a0f728ed4c98e3933d02465c825ec6689e4c36b86c2b595fb382c4eddd"}
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.179559 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" event={"ID":"167400a6-ae93-41a2-a825-ba7bd5984a12","Type":"ContainerStarted","Data":"c8aef6f71eaee740717b9e09909cec8ed7ddc8e86fe03d4b570d84a6e996609d"}
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.179855 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh"
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.190110 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-29k7h" podStartSLOduration=2.217299045 podStartE2EDuration="5.190089193s" podCreationTimestamp="2026-01-30 21:27:39 +0000 UTC" firstStartedPulling="2026-01-30 21:27:39.974404895 +0000 UTC m=+793.413041646" lastFinishedPulling="2026-01-30 21:27:42.947195033 +0000 UTC m=+796.385831794" observedRunningTime="2026-01-30 21:27:44.187272819 +0000 UTC m=+797.625909600" watchObservedRunningTime="2026-01-30 21:27:44.190089193 +0000 UTC m=+797.628725964"
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.238314 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-njdw4" podStartSLOduration=1.784258848 podStartE2EDuration="5.238292421s" podCreationTimestamp="2026-01-30 21:27:39 +0000 UTC" firstStartedPulling="2026-01-30 21:27:39.494099561 +0000 UTC m=+792.932736322" lastFinishedPulling="2026-01-30 21:27:42.948133104 +0000 UTC m=+796.386769895" observedRunningTime="2026-01-30 21:27:44.20990476 +0000 UTC m=+797.648541521" watchObservedRunningTime="2026-01-30 21:27:44.238292421 +0000 UTC m=+797.676929192"
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.239603 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh" podStartSLOduration=2.166645351 podStartE2EDuration="5.23959649s" podCreationTimestamp="2026-01-30 21:27:39 +0000 UTC" firstStartedPulling="2026-01-30 21:27:39.878261424 +0000 UTC m=+793.316898185" lastFinishedPulling="2026-01-30 21:27:42.951212553 +0000 UTC m=+796.389849324" observedRunningTime="2026-01-30 21:27:44.228376947 +0000 UTC m=+797.667013728" watchObservedRunningTime="2026-01-30 21:27:44.23959649 +0000 UTC m=+797.678233271"
Jan 30 21:27:44 crc kubenswrapper[4914]: I0130 21:27:44.471242 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-njdw4"
Jan 30 21:27:46 crc kubenswrapper[4914]: I0130 21:27:46.199690 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" event={"ID":"5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d","Type":"ContainerStarted","Data":"733952cf051f31c83fec6d12aae918812296ade1169a097a0762c60286446e6e"}
Jan 30 21:27:46 crc kubenswrapper[4914]: I0130 21:27:46.227818 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-vrwjg" podStartSLOduration=1.4185973729999999 podStartE2EDuration="7.227795078s" podCreationTimestamp="2026-01-30 21:27:39 +0000 UTC" firstStartedPulling="2026-01-30 21:27:39.86036782 +0000 UTC m=+793.299004581" lastFinishedPulling="2026-01-30 21:27:45.669565515 +0000 UTC m=+799.108202286" observedRunningTime="2026-01-30 21:27:46.225476486 +0000 UTC m=+799.664113317" watchObservedRunningTime="2026-01-30 21:27:46.227795078 +0000 UTC m=+799.666431879"
Jan 30 21:27:49 crc kubenswrapper[4914]: I0130 21:27:49.505140 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-njdw4"
Jan 30 21:27:49 crc kubenswrapper[4914]: I0130 21:27:49.767510 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dc57c7dfc-c7mfm"
Jan 30 21:27:49 crc kubenswrapper[4914]: I0130 21:27:49.767589 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5dc57c7dfc-c7mfm"
Jan 30 21:27:49 crc kubenswrapper[4914]: I0130 21:27:49.775396 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dc57c7dfc-c7mfm"
Jan 30 21:27:50 crc kubenswrapper[4914]: I0130 21:27:50.247525 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dc57c7dfc-c7mfm"
Jan 30 21:27:50 crc kubenswrapper[4914]: I0130 21:27:50.322385 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-scclv"]
Jan 30 21:27:56 crc kubenswrapper[4914]: I0130 21:27:56.983372 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:27:56 crc kubenswrapper[4914]: I0130 21:27:56.984184 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:27:59 crc kubenswrapper[4914]: I0130 21:27:59.443080 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-lfjgh"
Jan 30 21:28:15 crc kubenswrapper[4914]: I0130 21:28:15.388337 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-scclv" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerName="console" containerID="cri-o://671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547" gracePeriod=15
Jan 30 21:28:15 crc kubenswrapper[4914]: I0130 21:28:15.868037 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-scclv_77a21683-69d1-4459-aa95-cf4f0d33ec19/console/0.log"
Jan 30 21:28:15 crc kubenswrapper[4914]: I0130 21:28:15.868347 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014476 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzvbx\" (UniqueName: \"kubernetes.io/projected/77a21683-69d1-4459-aa95-cf4f0d33ec19-kube-api-access-gzvbx\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014548 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-service-ca\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014568 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-trusted-ca-bundle\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014597 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-oauth-serving-cert\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014659 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-oauth-config\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014697 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-config\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.014737 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-serving-cert\") pod \"77a21683-69d1-4459-aa95-cf4f0d33ec19\" (UID: \"77a21683-69d1-4459-aa95-cf4f0d33ec19\") "
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.015110 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-service-ca" (OuterVolumeSpecName: "service-ca") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.015120 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.015176 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.015505 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-config" (OuterVolumeSpecName: "console-config") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.024441 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.024592 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a21683-69d1-4459-aa95-cf4f0d33ec19-kube-api-access-gzvbx" (OuterVolumeSpecName: "kube-api-access-gzvbx") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "kube-api-access-gzvbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.025094 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "77a21683-69d1-4459-aa95-cf4f0d33ec19" (UID: "77a21683-69d1-4459-aa95-cf4f0d33ec19"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115809 4914 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-oauth-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115843 4914 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115853 4914 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77a21683-69d1-4459-aa95-cf4f0d33ec19-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115862 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzvbx\" (UniqueName: \"kubernetes.io/projected/77a21683-69d1-4459-aa95-cf4f0d33ec19-kube-api-access-gzvbx\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115872 4914 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-service-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115880 4914 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.115890 4914 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77a21683-69d1-4459-aa95-cf4f0d33ec19-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.392739 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"]
Jan 30 21:28:16 crc kubenswrapper[4914]: E0130 21:28:16.394007 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerName="console"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.394127 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerName="console"
Jan 30 21:28:16 crc kubenswrapper[4914]: E0130 21:28:16.394199 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="registry-server"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.394268 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="registry-server"
Jan 30 21:28:16 crc kubenswrapper[4914]: E0130 21:28:16.394354 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="extract-content"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.394430 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="extract-content"
Jan 30 21:28:16 crc kubenswrapper[4914]: E0130 21:28:16.394509 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="extract-utilities"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.394743 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="extract-utilities"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.395030 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="44eb32b5-f694-4d61-a5ec-57be19cddd47" containerName="registry-server"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.395124 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerName="console"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.396143 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.399594 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.405192 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"]
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.469084 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-scclv_77a21683-69d1-4459-aa95-cf4f0d33ec19/console/0.log"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.469342 4914 generic.go:334] "Generic (PLEG): container finished" podID="77a21683-69d1-4459-aa95-cf4f0d33ec19" containerID="671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547" exitCode=2
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.469486 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-scclv" event={"ID":"77a21683-69d1-4459-aa95-cf4f0d33ec19","Type":"ContainerDied","Data":"671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547"}
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.469586 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-scclv" event={"ID":"77a21683-69d1-4459-aa95-cf4f0d33ec19","Type":"ContainerDied","Data":"1a15303484b5e02914a3a24ffb174692567f7691ff5cc6f589dc0c95044b11fa"}
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.469669 4914 scope.go:117] "RemoveContainer" containerID="671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.469887 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-scclv"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.493613 4914 scope.go:117] "RemoveContainer" containerID="671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547"
Jan 30 21:28:16 crc kubenswrapper[4914]: E0130 21:28:16.494208 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547\": container with ID starting with 671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547 not found: ID does not exist" containerID="671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.494334 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547"} err="failed to get container status \"671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547\": rpc error: code = NotFound desc = could not find container \"671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547\": container with ID starting with 671cf4ea7c0f5d3ac6f4f9fdc516b550b7915210c32d3f66a2c947d7d6bf3547 not found: ID does not exist"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.497178 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-scclv"]
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.503772 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-scclv"]
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.521454 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.521819 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.522059 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r558\" (UniqueName: \"kubernetes.io/projected/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-kube-api-access-4r558\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.623245 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r558\" (UniqueName: \"kubernetes.io/projected/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-kube-api-access-4r558\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.623329 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.623379 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.624044 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.624064 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.646116 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r558\" (UniqueName: \"kubernetes.io/projected/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-kube-api-access-4r558\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:16 crc kubenswrapper[4914]: I0130 21:28:16.749586 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:17 crc kubenswrapper[4914]: I0130 21:28:17.039343 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"]
Jan 30 21:28:17 crc kubenswrapper[4914]: I0130 21:28:17.481154 4914 generic.go:334] "Generic (PLEG): container finished" podID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerID="465e81886d84cd59e4eae01dd5f13f286295d230323746ae7f1e0bf31fc7992d" exitCode=0
Jan 30 21:28:17 crc kubenswrapper[4914]: I0130 21:28:17.481242 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm" event={"ID":"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5","Type":"ContainerDied","Data":"465e81886d84cd59e4eae01dd5f13f286295d230323746ae7f1e0bf31fc7992d"}
Jan 30 21:28:17 crc kubenswrapper[4914]: I0130 21:28:17.481335 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm" event={"ID":"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5","Type":"ContainerStarted","Data":"9abf64e8e8c37ce294c3ddb9a8ae23caea619653421de6a9bd59acca8529764f"}
Jan 30 21:28:17 crc kubenswrapper[4914]: I0130 21:28:17.825930 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a21683-69d1-4459-aa95-cf4f0d33ec19" path="/var/lib/kubelet/pods/77a21683-69d1-4459-aa95-cf4f0d33ec19/volumes"
Jan 30 21:28:19 crc kubenswrapper[4914]: I0130 21:28:19.498504 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm" event={"ID":"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5","Type":"ContainerStarted","Data":"9d023159ae6da871a8e47f768706b772bbec1794fb0347a4eece907984f5fb09"}
Jan 30 21:28:20 crc kubenswrapper[4914]: I0130 21:28:20.508886 4914 generic.go:334] "Generic (PLEG): container finished" podID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerID="9d023159ae6da871a8e47f768706b772bbec1794fb0347a4eece907984f5fb09" exitCode=0
Jan 30 21:28:20 crc kubenswrapper[4914]: I0130 21:28:20.508962 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm" event={"ID":"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5","Type":"ContainerDied","Data":"9d023159ae6da871a8e47f768706b772bbec1794fb0347a4eece907984f5fb09"}
Jan 30 21:28:21 crc kubenswrapper[4914]: I0130 21:28:21.524169 4914 generic.go:334] "Generic (PLEG): container finished" podID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerID="092bb712b5a3f6d0398707b093ed1c1099a9f92c664d89c114bd43f81418c32d" exitCode=0
Jan 30 21:28:21 crc kubenswrapper[4914]: I0130 21:28:21.524464 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm" event={"ID":"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5","Type":"ContainerDied","Data":"092bb712b5a3f6d0398707b093ed1c1099a9f92c664d89c114bd43f81418c32d"}
Jan 30 21:28:22 crc kubenswrapper[4914]: I0130 21:28:22.875402 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.009586 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-bundle\") pod \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") "
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.009845 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-util\") pod \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") "
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.009921 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r558\" (UniqueName: \"kubernetes.io/projected/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-kube-api-access-4r558\") pod \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\" (UID: \"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5\") "
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.011177 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-bundle" (OuterVolumeSpecName: "bundle") pod "c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" (UID: "c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.018174 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-kube-api-access-4r558" (OuterVolumeSpecName: "kube-api-access-4r558") pod "c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" (UID: "c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5"). InnerVolumeSpecName "kube-api-access-4r558". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.042993 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-util" (OuterVolumeSpecName: "util") pod "c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" (UID: "c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.112305 4914 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.112680 4914 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-util\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.112701 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r558\" (UniqueName: \"kubernetes.io/projected/c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5-kube-api-access-4r558\") on node \"crc\" DevicePath \"\""
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.541301 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm" event={"ID":"c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5","Type":"ContainerDied","Data":"9abf64e8e8c37ce294c3ddb9a8ae23caea619653421de6a9bd59acca8529764f"}
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.541377 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9abf64e8e8c37ce294c3ddb9a8ae23caea619653421de6a9bd59acca8529764f"
Jan 30 21:28:23 crc kubenswrapper[4914]: I0130 21:28:23.541418 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm"
Jan 30 21:28:26 crc kubenswrapper[4914]: I0130 21:28:26.982959 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:28:26 crc kubenswrapper[4914]: I0130 21:28:26.983316 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:28:26 crc kubenswrapper[4914]: I0130 21:28:26.983376 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 21:28:26 crc kubenswrapper[4914]: I0130 21:28:26.984204 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e121058e768dda1d14fe4563b4b94e4252170909803ddfd6651100686fef20ef"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 21:28:26 crc kubenswrapper[4914]: I0130 21:28:26.984307 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://e121058e768dda1d14fe4563b4b94e4252170909803ddfd6651100686fef20ef" gracePeriod=600
Jan 30 21:28:27 crc kubenswrapper[4914]: I0130 21:28:27.571055 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="e121058e768dda1d14fe4563b4b94e4252170909803ddfd6651100686fef20ef" exitCode=0
Jan 30 21:28:27 crc kubenswrapper[4914]: I0130 21:28:27.571095 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"e121058e768dda1d14fe4563b4b94e4252170909803ddfd6651100686fef20ef"}
Jan 30 21:28:27 crc kubenswrapper[4914]: I0130 21:28:27.571497 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"af122f4ba69a9f285a7275f9a58f9bcc4666b137ea591150601d02ec4dc641e5"}
Jan 30 21:28:27 crc kubenswrapper[4914]: I0130 21:28:27.571525 4914 scope.go:117] "RemoveContainer" containerID="24c5be9264259bf70fbe610f05edf4820e483959d98c60593634eeec5ed85321"
Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.079221 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m"]
Jan 30 21:28:31 crc kubenswrapper[4914]: E0130 21:28:31.080905 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="extract"
Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.081011 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="extract"
Jan 30 21:28:31 crc kubenswrapper[4914]: E0130 21:28:31.081101 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="util"
Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.081269 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="util"
Jan 30 21:28:31 crc kubenswrapper[4914]: E0130 21:28:31.081351 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="pull"
Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.081423 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="pull"
Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.081632 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5" containerName="extract"
Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.082268 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.086657 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.086754 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.086911 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.086925 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hjqm2" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.087100 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.105616 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m"] Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.130040 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a34acb77-7da2-4edf-b829-d8b8ce25657e-webhook-cert\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.130398 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a34acb77-7da2-4edf-b829-d8b8ce25657e-apiservice-cert\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: 
\"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.130620 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8ghp\" (UniqueName: \"kubernetes.io/projected/a34acb77-7da2-4edf-b829-d8b8ce25657e-kube-api-access-g8ghp\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.231804 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a34acb77-7da2-4edf-b829-d8b8ce25657e-webhook-cert\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.232156 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a34acb77-7da2-4edf-b829-d8b8ce25657e-apiservice-cert\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.232217 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8ghp\" (UniqueName: \"kubernetes.io/projected/a34acb77-7da2-4edf-b829-d8b8ce25657e-kube-api-access-g8ghp\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.238424 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a34acb77-7da2-4edf-b829-d8b8ce25657e-apiservice-cert\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.238425 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a34acb77-7da2-4edf-b829-d8b8ce25657e-webhook-cert\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.246826 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8ghp\" (UniqueName: \"kubernetes.io/projected/a34acb77-7da2-4edf-b829-d8b8ce25657e-kube-api-access-g8ghp\") pod \"metallb-operator-controller-manager-674bc87d47-gkh5m\" (UID: \"a34acb77-7da2-4edf-b829-d8b8ce25657e\") " pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.401622 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.513929 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn"] Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.515002 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.518053 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-86pkd" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.518195 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.530356 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.535232 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn"] Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.636413 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-webhook-cert\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.636504 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-apiservice-cert\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.636671 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx996\" (UniqueName: 
\"kubernetes.io/projected/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-kube-api-access-lx996\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.665799 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m"] Jan 30 21:28:31 crc kubenswrapper[4914]: W0130 21:28:31.677023 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda34acb77_7da2_4edf_b829_d8b8ce25657e.slice/crio-b1a91b374321e4a931310a1e5b7d1fa71ca01696ee2d3de2cf6eb192b892108c WatchSource:0}: Error finding container b1a91b374321e4a931310a1e5b7d1fa71ca01696ee2d3de2cf6eb192b892108c: Status 404 returned error can't find the container with id b1a91b374321e4a931310a1e5b7d1fa71ca01696ee2d3de2cf6eb192b892108c Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.737460 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-apiservice-cert\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.737561 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx996\" (UniqueName: \"kubernetes.io/projected/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-kube-api-access-lx996\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.737604 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-webhook-cert\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.744143 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-webhook-cert\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.747271 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-apiservice-cert\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.765408 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx996\" (UniqueName: \"kubernetes.io/projected/900c747e-7f1b-43d2-ae93-ea5e9891dcf1-kube-api-access-lx996\") pod \"metallb-operator-webhook-server-55c6c4f998-z99nn\" (UID: \"900c747e-7f1b-43d2-ae93-ea5e9891dcf1\") " pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:31 crc kubenswrapper[4914]: I0130 21:28:31.841050 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:32 crc kubenswrapper[4914]: I0130 21:28:32.289027 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn"] Jan 30 21:28:32 crc kubenswrapper[4914]: W0130 21:28:32.298099 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod900c747e_7f1b_43d2_ae93_ea5e9891dcf1.slice/crio-f7beab4a90e91dfeedde0b696a4eb828f7bdb546bed7674feb5ded97e2547158 WatchSource:0}: Error finding container f7beab4a90e91dfeedde0b696a4eb828f7bdb546bed7674feb5ded97e2547158: Status 404 returned error can't find the container with id f7beab4a90e91dfeedde0b696a4eb828f7bdb546bed7674feb5ded97e2547158 Jan 30 21:28:32 crc kubenswrapper[4914]: I0130 21:28:32.616698 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" event={"ID":"900c747e-7f1b-43d2-ae93-ea5e9891dcf1","Type":"ContainerStarted","Data":"f7beab4a90e91dfeedde0b696a4eb828f7bdb546bed7674feb5ded97e2547158"} Jan 30 21:28:32 crc kubenswrapper[4914]: I0130 21:28:32.618471 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" event={"ID":"a34acb77-7da2-4edf-b829-d8b8ce25657e","Type":"ContainerStarted","Data":"b1a91b374321e4a931310a1e5b7d1fa71ca01696ee2d3de2cf6eb192b892108c"} Jan 30 21:28:35 crc kubenswrapper[4914]: I0130 21:28:35.643923 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" event={"ID":"a34acb77-7da2-4edf-b829-d8b8ce25657e","Type":"ContainerStarted","Data":"f1098dcc964304729a54b8ef6991178e8e18e4853e732d8f0f23d736b29ff4f0"} Jan 30 21:28:35 crc kubenswrapper[4914]: I0130 21:28:35.644683 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:28:35 crc kubenswrapper[4914]: I0130 21:28:35.681064 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" podStartSLOduration=1.235663566 podStartE2EDuration="4.681040067s" podCreationTimestamp="2026-01-30 21:28:31 +0000 UTC" firstStartedPulling="2026-01-30 21:28:31.678673025 +0000 UTC m=+845.117309796" lastFinishedPulling="2026-01-30 21:28:35.124049536 +0000 UTC m=+848.562686297" observedRunningTime="2026-01-30 21:28:35.672595364 +0000 UTC m=+849.111232125" watchObservedRunningTime="2026-01-30 21:28:35.681040067 +0000 UTC m=+849.119676828" Jan 30 21:28:37 crc kubenswrapper[4914]: I0130 21:28:37.657484 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" event={"ID":"900c747e-7f1b-43d2-ae93-ea5e9891dcf1","Type":"ContainerStarted","Data":"f048d8ad9155c23641385eb9741e7e41683e3800d015978bda0933cd91d77ba8"} Jan 30 21:28:37 crc kubenswrapper[4914]: I0130 21:28:37.658072 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:28:37 crc kubenswrapper[4914]: I0130 21:28:37.681235 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" podStartSLOduration=2.259509295 podStartE2EDuration="6.681217394s" podCreationTimestamp="2026-01-30 21:28:31 +0000 UTC" firstStartedPulling="2026-01-30 21:28:32.302520441 +0000 UTC m=+845.741157202" lastFinishedPulling="2026-01-30 21:28:36.72422854 +0000 UTC m=+850.162865301" observedRunningTime="2026-01-30 21:28:37.678645642 +0000 UTC m=+851.117282423" watchObservedRunningTime="2026-01-30 21:28:37.681217394 +0000 UTC m=+851.119854165" Jan 30 21:28:51 crc kubenswrapper[4914]: I0130 21:28:51.845088 4914 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-55c6c4f998-z99nn" Jan 30 21:29:11 crc kubenswrapper[4914]: I0130 21:29:11.406048 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.223459 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-srzwt"] Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.226443 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.234537 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9"] Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.235574 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.243660 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.244104 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.244135 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tnlqz" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.244604 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.254982 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9"] Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288419 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-metrics\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288473 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-conf\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288517 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18da6d54-2a6a-4109-9927-194cc41ef5f5-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-8tvn9\" (UID: \"18da6d54-2a6a-4109-9927-194cc41ef5f5\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288538 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-reloader\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288575 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-sockets\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288622 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-metrics-certs\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288645 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-startup\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288676 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc46d\" (UniqueName: \"kubernetes.io/projected/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-kube-api-access-rc46d\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.288723 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjwr\" (UniqueName: \"kubernetes.io/projected/18da6d54-2a6a-4109-9927-194cc41ef5f5-kube-api-access-5cjwr\") pod \"frr-k8s-webhook-server-7df86c4f6c-8tvn9\" (UID: \"18da6d54-2a6a-4109-9927-194cc41ef5f5\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.344001 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dwwzf"] Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.345167 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.348881 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.349203 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.349303 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rnxcj" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.349316 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.385184 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-8624b"] Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.386587 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390073 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/165fe06d-18fe-40a3-b24d-6093aea89a4e-metallb-excludel2\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390119 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-metrics\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390146 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-conf\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390204 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18da6d54-2a6a-4109-9927-194cc41ef5f5-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-8tvn9\" (UID: \"18da6d54-2a6a-4109-9927-194cc41ef5f5\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390235 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-reloader\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390272 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-sockets\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390319 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390348 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-metrics-certs\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390376 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-startup\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390402 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6qm\" (UniqueName: \"kubernetes.io/projected/165fe06d-18fe-40a3-b24d-6093aea89a4e-kube-api-access-rf6qm\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390431 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc46d\" (UniqueName: 
\"kubernetes.io/projected/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-kube-api-access-rc46d\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390456 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjwr\" (UniqueName: \"kubernetes.io/projected/18da6d54-2a6a-4109-9927-194cc41ef5f5-kube-api-access-5cjwr\") pod \"frr-k8s-webhook-server-7df86c4f6c-8tvn9\" (UID: \"18da6d54-2a6a-4109-9927-194cc41ef5f5\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390490 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-metrics-certs\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.390821 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-conf\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.391211 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-metrics\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.392063 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-sockets\") pod \"frr-k8s-srzwt\" (UID: 
\"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.392198 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-frr-startup\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.392623 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-reloader\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.393790 4914 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.433463 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-metrics-certs\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.442833 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjwr\" (UniqueName: \"kubernetes.io/projected/18da6d54-2a6a-4109-9927-194cc41ef5f5-kube-api-access-5cjwr\") pod \"frr-k8s-webhook-server-7df86c4f6c-8tvn9\" (UID: \"18da6d54-2a6a-4109-9927-194cc41ef5f5\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.444047 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18da6d54-2a6a-4109-9927-194cc41ef5f5-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-8tvn9\" (UID: 
\"18da6d54-2a6a-4109-9927-194cc41ef5f5\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.482594 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc46d\" (UniqueName: \"kubernetes.io/projected/aaa76eb6-e780-499d-8fb2-8eeb68cdbae8-kube-api-access-rc46d\") pod \"frr-k8s-srzwt\" (UID: \"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8\") " pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.482714 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-8624b"] Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493168 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493227 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6qm\" (UniqueName: \"kubernetes.io/projected/165fe06d-18fe-40a3-b24d-6093aea89a4e-kube-api-access-rf6qm\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493280 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f282ec06-f396-424a-9fba-54a7a74b1831-metrics-certs\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493312 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dlg\" (UniqueName: 
\"kubernetes.io/projected/f282ec06-f396-424a-9fba-54a7a74b1831-kube-api-access-42dlg\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493349 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-metrics-certs\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493382 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f282ec06-f396-424a-9fba-54a7a74b1831-cert\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.493416 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/165fe06d-18fe-40a3-b24d-6093aea89a4e-metallb-excludel2\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: E0130 21:29:12.494269 4914 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 21:29:12 crc kubenswrapper[4914]: E0130 21:29:12.494336 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist podName:165fe06d-18fe-40a3-b24d-6093aea89a4e nodeName:}" failed. No retries permitted until 2026-01-30 21:29:12.994320777 +0000 UTC m=+886.432957528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist") pod "speaker-dwwzf" (UID: "165fe06d-18fe-40a3-b24d-6093aea89a4e") : secret "metallb-memberlist" not found Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.494361 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/165fe06d-18fe-40a3-b24d-6093aea89a4e-metallb-excludel2\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.499297 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-metrics-certs\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.539256 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6qm\" (UniqueName: \"kubernetes.io/projected/165fe06d-18fe-40a3-b24d-6093aea89a4e-kube-api-access-rf6qm\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.547996 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.554939 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.594488 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42dlg\" (UniqueName: \"kubernetes.io/projected/f282ec06-f396-424a-9fba-54a7a74b1831-kube-api-access-42dlg\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.594550 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f282ec06-f396-424a-9fba-54a7a74b1831-cert\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.594653 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f282ec06-f396-424a-9fba-54a7a74b1831-metrics-certs\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.598066 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f282ec06-f396-424a-9fba-54a7a74b1831-metrics-certs\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.598078 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f282ec06-f396-424a-9fba-54a7a74b1831-cert\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 
21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.612802 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dlg\" (UniqueName: \"kubernetes.io/projected/f282ec06-f396-424a-9fba-54a7a74b1831-kube-api-access-42dlg\") pod \"controller-6968d8fdc4-8624b\" (UID: \"f282ec06-f396-424a-9fba-54a7a74b1831\") " pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.702355 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.890589 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"e783e0a627e82d2ab7e2a3c8244f1006b7e7ec2404e221be6900aa576c99bae9"} Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.899374 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-8624b"] Jan 30 21:29:12 crc kubenswrapper[4914]: W0130 21:29:12.900491 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf282ec06_f396_424a_9fba_54a7a74b1831.slice/crio-7e80151c0b159091d636ee9e907b706851a9bc2fe360bc974ca1ed9aa9e45742 WatchSource:0}: Error finding container 7e80151c0b159091d636ee9e907b706851a9bc2fe360bc974ca1ed9aa9e45742: Status 404 returned error can't find the container with id 7e80151c0b159091d636ee9e907b706851a9bc2fe360bc974ca1ed9aa9e45742 Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.992176 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9"] Jan 30 21:29:12 crc kubenswrapper[4914]: W0130 21:29:12.996921 4914 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18da6d54_2a6a_4109_9927_194cc41ef5f5.slice/crio-34401a9068fd721dcd8fc78dc8883cf87bd1bbd62034a529f5708158d0383453 WatchSource:0}: Error finding container 34401a9068fd721dcd8fc78dc8883cf87bd1bbd62034a529f5708158d0383453: Status 404 returned error can't find the container with id 34401a9068fd721dcd8fc78dc8883cf87bd1bbd62034a529f5708158d0383453 Jan 30 21:29:12 crc kubenswrapper[4914]: I0130 21:29:12.999816 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:12 crc kubenswrapper[4914]: E0130 21:29:12.999904 4914 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 21:29:12 crc kubenswrapper[4914]: E0130 21:29:12.999955 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist podName:165fe06d-18fe-40a3-b24d-6093aea89a4e nodeName:}" failed. No retries permitted until 2026-01-30 21:29:13.999941906 +0000 UTC m=+887.438578667 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist") pod "speaker-dwwzf" (UID: "165fe06d-18fe-40a3-b24d-6093aea89a4e") : secret "metallb-memberlist" not found Jan 30 21:29:13 crc kubenswrapper[4914]: I0130 21:29:13.899484 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-8624b" event={"ID":"f282ec06-f396-424a-9fba-54a7a74b1831","Type":"ContainerStarted","Data":"8fbcffdd17f0f6cdfbaf736328d920c1559def646b799a20af973217b3ac11ce"} Jan 30 21:29:13 crc kubenswrapper[4914]: I0130 21:29:13.899989 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:13 crc kubenswrapper[4914]: I0130 21:29:13.900029 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-8624b" event={"ID":"f282ec06-f396-424a-9fba-54a7a74b1831","Type":"ContainerStarted","Data":"91c45875db7ee69c0a89ce6d34711c0a3d9d804ee57fe97202ceeca458e98797"} Jan 30 21:29:13 crc kubenswrapper[4914]: I0130 21:29:13.900044 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-8624b" event={"ID":"f282ec06-f396-424a-9fba-54a7a74b1831","Type":"ContainerStarted","Data":"7e80151c0b159091d636ee9e907b706851a9bc2fe360bc974ca1ed9aa9e45742"} Jan 30 21:29:13 crc kubenswrapper[4914]: I0130 21:29:13.900343 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" event={"ID":"18da6d54-2a6a-4109-9927-194cc41ef5f5","Type":"ContainerStarted","Data":"34401a9068fd721dcd8fc78dc8883cf87bd1bbd62034a529f5708158d0383453"} Jan 30 21:29:13 crc kubenswrapper[4914]: I0130 21:29:13.920366 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-8624b" podStartSLOduration=1.920342662 podStartE2EDuration="1.920342662s" 
podCreationTimestamp="2026-01-30 21:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:29:13.916152531 +0000 UTC m=+887.354789292" watchObservedRunningTime="2026-01-30 21:29:13.920342662 +0000 UTC m=+887.358979423" Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.015093 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.027239 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/165fe06d-18fe-40a3-b24d-6093aea89a4e-memberlist\") pod \"speaker-dwwzf\" (UID: \"165fe06d-18fe-40a3-b24d-6093aea89a4e\") " pod="metallb-system/speaker-dwwzf" Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.158616 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dwwzf" Jan 30 21:29:14 crc kubenswrapper[4914]: W0130 21:29:14.182291 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165fe06d_18fe_40a3_b24d_6093aea89a4e.slice/crio-f040b99f1a40b7e3dc434a95704c492c68ddb1fb16a6b099b37e027e78f73d10 WatchSource:0}: Error finding container f040b99f1a40b7e3dc434a95704c492c68ddb1fb16a6b099b37e027e78f73d10: Status 404 returned error can't find the container with id f040b99f1a40b7e3dc434a95704c492c68ddb1fb16a6b099b37e027e78f73d10 Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.908802 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dwwzf" event={"ID":"165fe06d-18fe-40a3-b24d-6093aea89a4e","Type":"ContainerStarted","Data":"be53ddcd2494c434f38b22d3058f87288a246c49675f2726b422b4b30ea22f4e"} Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.909143 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dwwzf" event={"ID":"165fe06d-18fe-40a3-b24d-6093aea89a4e","Type":"ContainerStarted","Data":"24b73de6f19f21b2e53e5c5904cd3f8187ea798ec42f82a93c39497d10462897"} Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.909159 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dwwzf" event={"ID":"165fe06d-18fe-40a3-b24d-6093aea89a4e","Type":"ContainerStarted","Data":"f040b99f1a40b7e3dc434a95704c492c68ddb1fb16a6b099b37e027e78f73d10"} Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.909305 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dwwzf" Jan 30 21:29:14 crc kubenswrapper[4914]: I0130 21:29:14.962581 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dwwzf" podStartSLOduration=2.962563022 podStartE2EDuration="2.962563022s" podCreationTimestamp="2026-01-30 21:29:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:29:14.955585844 +0000 UTC m=+888.394222605" watchObservedRunningTime="2026-01-30 21:29:14.962563022 +0000 UTC m=+888.401199783" Jan 30 21:29:21 crc kubenswrapper[4914]: I0130 21:29:21.956453 4914 generic.go:334] "Generic (PLEG): container finished" podID="aaa76eb6-e780-499d-8fb2-8eeb68cdbae8" containerID="0fe516038d834cab89007c481be86c4b163867ce305f4a3b6b6083cd37f738ee" exitCode=0 Jan 30 21:29:21 crc kubenswrapper[4914]: I0130 21:29:21.956547 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerDied","Data":"0fe516038d834cab89007c481be86c4b163867ce305f4a3b6b6083cd37f738ee"} Jan 30 21:29:21 crc kubenswrapper[4914]: I0130 21:29:21.959056 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" event={"ID":"18da6d54-2a6a-4109-9927-194cc41ef5f5","Type":"ContainerStarted","Data":"ad096a9a4697bdd529a2cafb06217e9e75c797f1d1a7d81ebc7e8638b36cefee"} Jan 30 21:29:21 crc kubenswrapper[4914]: I0130 21:29:21.959223 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:21 crc kubenswrapper[4914]: I0130 21:29:21.991610 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" podStartSLOduration=2.068064938 podStartE2EDuration="9.991592823s" podCreationTimestamp="2026-01-30 21:29:12 +0000 UTC" firstStartedPulling="2026-01-30 21:29:13.000807627 +0000 UTC m=+886.439444408" lastFinishedPulling="2026-01-30 21:29:20.924335532 +0000 UTC m=+894.362972293" observedRunningTime="2026-01-30 21:29:21.986847209 +0000 UTC m=+895.425483970" watchObservedRunningTime="2026-01-30 21:29:21.991592823 +0000 UTC m=+895.430229584" Jan 30 
21:29:22 crc kubenswrapper[4914]: I0130 21:29:22.965462 4914 generic.go:334] "Generic (PLEG): container finished" podID="aaa76eb6-e780-499d-8fb2-8eeb68cdbae8" containerID="0351b5e8cee2d09d1f5bc7f89f45bc155362aa5328e75c52dd20ffa2d68a37ae" exitCode=0 Jan 30 21:29:22 crc kubenswrapper[4914]: I0130 21:29:22.965642 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerDied","Data":"0351b5e8cee2d09d1f5bc7f89f45bc155362aa5328e75c52dd20ffa2d68a37ae"} Jan 30 21:29:23 crc kubenswrapper[4914]: I0130 21:29:23.974830 4914 generic.go:334] "Generic (PLEG): container finished" podID="aaa76eb6-e780-499d-8fb2-8eeb68cdbae8" containerID="140d55958deacf95b83a770812ba650ad96264c41039c2b1bedb3a2b18e0677b" exitCode=0 Jan 30 21:29:23 crc kubenswrapper[4914]: I0130 21:29:23.974896 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerDied","Data":"140d55958deacf95b83a770812ba650ad96264c41039c2b1bedb3a2b18e0677b"} Jan 30 21:29:24 crc kubenswrapper[4914]: I0130 21:29:24.162312 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dwwzf" Jan 30 21:29:24 crc kubenswrapper[4914]: I0130 21:29:24.988060 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"4586b8dc96b0c5eaef9c1ab80f7e2a5b5b7a7576da9ec5221273327160141272"} Jan 30 21:29:24 crc kubenswrapper[4914]: I0130 21:29:24.988424 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"99fe7b0886f490f83531c663893f20be88df0b60a848181d77b8da39ca45b74c"} Jan 30 21:29:26 crc kubenswrapper[4914]: I0130 21:29:26.002943 4914 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"7364f7e388d6de957d00bb3dd5d26b1aef32179c2c489a150c58f46b722f8df7"} Jan 30 21:29:26 crc kubenswrapper[4914]: I0130 21:29:26.002999 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"0885b3601765472a443690e2a2041fdafa60b0260ac2b4cf380c7ccdb735cb4c"} Jan 30 21:29:26 crc kubenswrapper[4914]: I0130 21:29:26.003015 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"38c705d35676c43ab6ef4305955180485589937de7b6d06d747ed0614957e269"} Jan 30 21:29:26 crc kubenswrapper[4914]: I0130 21:29:26.003029 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-srzwt" event={"ID":"aaa76eb6-e780-499d-8fb2-8eeb68cdbae8","Type":"ContainerStarted","Data":"acab4b479103492ead515e746310aaf5827a0170c73c5a32369c475a300c8836"} Jan 30 21:29:26 crc kubenswrapper[4914]: I0130 21:29:26.003194 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:26 crc kubenswrapper[4914]: I0130 21:29:26.036313 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-srzwt" podStartSLOduration=5.753714336 podStartE2EDuration="14.036287891s" podCreationTimestamp="2026-01-30 21:29:12 +0000 UTC" firstStartedPulling="2026-01-30 21:29:12.673924029 +0000 UTC m=+886.112560790" lastFinishedPulling="2026-01-30 21:29:20.956497584 +0000 UTC m=+894.395134345" observedRunningTime="2026-01-30 21:29:26.030976593 +0000 UTC m=+899.469613354" watchObservedRunningTime="2026-01-30 21:29:26.036287891 +0000 UTC m=+899.474924692" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.172300 4914 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/openstack-operator-index-mqwrr"] Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.173238 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mqwrr" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.175862 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jf69p" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.176287 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.177487 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.188936 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mqwrr"] Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.207004 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqs7s\" (UniqueName: \"kubernetes.io/projected/5fe29cd0-120d-40b6-b9eb-48da17fd1049-kube-api-access-sqs7s\") pod \"openstack-operator-index-mqwrr\" (UID: \"5fe29cd0-120d-40b6-b9eb-48da17fd1049\") " pod="openstack-operators/openstack-operator-index-mqwrr" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.308161 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqs7s\" (UniqueName: \"kubernetes.io/projected/5fe29cd0-120d-40b6-b9eb-48da17fd1049-kube-api-access-sqs7s\") pod \"openstack-operator-index-mqwrr\" (UID: \"5fe29cd0-120d-40b6-b9eb-48da17fd1049\") " pod="openstack-operators/openstack-operator-index-mqwrr" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.337770 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sqs7s\" (UniqueName: \"kubernetes.io/projected/5fe29cd0-120d-40b6-b9eb-48da17fd1049-kube-api-access-sqs7s\") pod \"openstack-operator-index-mqwrr\" (UID: \"5fe29cd0-120d-40b6-b9eb-48da17fd1049\") " pod="openstack-operators/openstack-operator-index-mqwrr" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.505431 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mqwrr" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.549438 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.608453 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-srzwt" Jan 30 21:29:27 crc kubenswrapper[4914]: I0130 21:29:27.926219 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mqwrr"] Jan 30 21:29:28 crc kubenswrapper[4914]: I0130 21:29:28.016283 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mqwrr" event={"ID":"5fe29cd0-120d-40b6-b9eb-48da17fd1049","Type":"ContainerStarted","Data":"f58e516cdf9c9adadf1066a5db171c2d12aa1e44bfcbc1a9aa260bde6043b06d"} Jan 30 21:29:30 crc kubenswrapper[4914]: I0130 21:29:30.539730 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mqwrr"] Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.357655 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fqj8p"] Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.360393 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fqj8p" Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.365568 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fqj8p"] Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.461843 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8892\" (UniqueName: \"kubernetes.io/projected/a7dd7e37-8bb3-44c5-982e-021408582fc6-kube-api-access-q8892\") pod \"openstack-operator-index-fqj8p\" (UID: \"a7dd7e37-8bb3-44c5-982e-021408582fc6\") " pod="openstack-operators/openstack-operator-index-fqj8p" Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.563410 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8892\" (UniqueName: \"kubernetes.io/projected/a7dd7e37-8bb3-44c5-982e-021408582fc6-kube-api-access-q8892\") pod \"openstack-operator-index-fqj8p\" (UID: \"a7dd7e37-8bb3-44c5-982e-021408582fc6\") " pod="openstack-operators/openstack-operator-index-fqj8p" Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.594563 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8892\" (UniqueName: \"kubernetes.io/projected/a7dd7e37-8bb3-44c5-982e-021408582fc6-kube-api-access-q8892\") pod \"openstack-operator-index-fqj8p\" (UID: \"a7dd7e37-8bb3-44c5-982e-021408582fc6\") " pod="openstack-operators/openstack-operator-index-fqj8p" Jan 30 21:29:31 crc kubenswrapper[4914]: I0130 21:29:31.691991 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fqj8p" Jan 30 21:29:32 crc kubenswrapper[4914]: I0130 21:29:32.562002 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-8tvn9" Jan 30 21:29:32 crc kubenswrapper[4914]: I0130 21:29:32.707751 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-8624b" Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.131793 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fqj8p"] Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.755488 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-28dfs"] Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.757508 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28dfs" Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.775931 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28dfs"] Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.814486 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkv2w\" (UniqueName: \"kubernetes.io/projected/5a28cf6d-31c5-4884-a167-2725c6700e42-kube-api-access-dkv2w\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs" Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.814558 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a28cf6d-31c5-4884-a167-2725c6700e42-utilities\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " 
pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.814640 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a28cf6d-31c5-4884-a167-2725c6700e42-catalog-content\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.915695 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a28cf6d-31c5-4884-a167-2725c6700e42-catalog-content\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.916044 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkv2w\" (UniqueName: \"kubernetes.io/projected/5a28cf6d-31c5-4884-a167-2725c6700e42-kube-api-access-dkv2w\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.916084 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a28cf6d-31c5-4884-a167-2725c6700e42-utilities\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.916532 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a28cf6d-31c5-4884-a167-2725c6700e42-catalog-content\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.916572 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a28cf6d-31c5-4884-a167-2725c6700e42-utilities\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:34 crc kubenswrapper[4914]: I0130 21:29:34.934628 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkv2w\" (UniqueName: \"kubernetes.io/projected/5a28cf6d-31c5-4884-a167-2725c6700e42-kube-api-access-dkv2w\") pod \"community-operators-28dfs\" (UID: \"5a28cf6d-31c5-4884-a167-2725c6700e42\") " pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:35 crc kubenswrapper[4914]: I0130 21:29:35.077210 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28dfs"
Jan 30 21:29:35 crc kubenswrapper[4914]: W0130 21:29:35.861200 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7dd7e37_8bb3_44c5_982e_021408582fc6.slice/crio-5cd5ff3a66ba19ecb3d4b8c1603ac0a225687ce5a43f85f15df09ba5a412051a WatchSource:0}: Error finding container 5cd5ff3a66ba19ecb3d4b8c1603ac0a225687ce5a43f85f15df09ba5a412051a: Status 404 returned error can't find the container with id 5cd5ff3a66ba19ecb3d4b8c1603ac0a225687ce5a43f85f15df09ba5a412051a
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.114098 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mqwrr" event={"ID":"5fe29cd0-120d-40b6-b9eb-48da17fd1049","Type":"ContainerStarted","Data":"acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7"}
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.114207 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mqwrr" podUID="5fe29cd0-120d-40b6-b9eb-48da17fd1049" containerName="registry-server" containerID="cri-o://acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7" gracePeriod=2
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.118214 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fqj8p" event={"ID":"a7dd7e37-8bb3-44c5-982e-021408582fc6","Type":"ContainerStarted","Data":"5cd5ff3a66ba19ecb3d4b8c1603ac0a225687ce5a43f85f15df09ba5a412051a"}
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.136717 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mqwrr" podStartSLOduration=1.192764505 podStartE2EDuration="9.136680051s" podCreationTimestamp="2026-01-30 21:29:27 +0000 UTC" firstStartedPulling="2026-01-30 21:29:27.932328647 +0000 UTC m=+901.370965408" lastFinishedPulling="2026-01-30 21:29:35.876244193 +0000 UTC m=+909.314880954" observedRunningTime="2026-01-30 21:29:36.131911336 +0000 UTC m=+909.570548107" watchObservedRunningTime="2026-01-30 21:29:36.136680051 +0000 UTC m=+909.575316822"
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.337968 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28dfs"]
Jan 30 21:29:36 crc kubenswrapper[4914]: W0130 21:29:36.353528 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a28cf6d_31c5_4884_a167_2725c6700e42.slice/crio-e62f30ea2125da99e420b69d4e62ae478c1ac3b73275d207945a154ffedad067 WatchSource:0}: Error finding container e62f30ea2125da99e420b69d4e62ae478c1ac3b73275d207945a154ffedad067: Status 404 returned error can't find the container with id e62f30ea2125da99e420b69d4e62ae478c1ac3b73275d207945a154ffedad067
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.603212 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mqwrr"
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.741700 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqs7s\" (UniqueName: \"kubernetes.io/projected/5fe29cd0-120d-40b6-b9eb-48da17fd1049-kube-api-access-sqs7s\") pod \"5fe29cd0-120d-40b6-b9eb-48da17fd1049\" (UID: \"5fe29cd0-120d-40b6-b9eb-48da17fd1049\") "
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.747407 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe29cd0-120d-40b6-b9eb-48da17fd1049-kube-api-access-sqs7s" (OuterVolumeSpecName: "kube-api-access-sqs7s") pod "5fe29cd0-120d-40b6-b9eb-48da17fd1049" (UID: "5fe29cd0-120d-40b6-b9eb-48da17fd1049"). InnerVolumeSpecName "kube-api-access-sqs7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:29:36 crc kubenswrapper[4914]: I0130 21:29:36.843628 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqs7s\" (UniqueName: \"kubernetes.io/projected/5fe29cd0-120d-40b6-b9eb-48da17fd1049-kube-api-access-sqs7s\") on node \"crc\" DevicePath \"\""
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.126782 4914 generic.go:334] "Generic (PLEG): container finished" podID="5fe29cd0-120d-40b6-b9eb-48da17fd1049" containerID="acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7" exitCode=0
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.126844 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mqwrr" event={"ID":"5fe29cd0-120d-40b6-b9eb-48da17fd1049","Type":"ContainerDied","Data":"acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7"}
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.126917 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mqwrr" event={"ID":"5fe29cd0-120d-40b6-b9eb-48da17fd1049","Type":"ContainerDied","Data":"f58e516cdf9c9adadf1066a5db171c2d12aa1e44bfcbc1a9aa260bde6043b06d"}
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.126949 4914 scope.go:117] "RemoveContainer" containerID="acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7"
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.128486 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fqj8p" event={"ID":"a7dd7e37-8bb3-44c5-982e-021408582fc6","Type":"ContainerStarted","Data":"ceaf78a67b5b5941997cb8276bfb7efe218f33c893442c379da420780d4e9913"}
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.129873 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mqwrr"
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.133205 4914 generic.go:334] "Generic (PLEG): container finished" podID="5a28cf6d-31c5-4884-a167-2725c6700e42" containerID="95522bc6e8e339452af42f6c301efd64f65c5391944fb6cd57484af3546dd346" exitCode=0
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.133241 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28dfs" event={"ID":"5a28cf6d-31c5-4884-a167-2725c6700e42","Type":"ContainerDied","Data":"95522bc6e8e339452af42f6c301efd64f65c5391944fb6cd57484af3546dd346"}
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.133262 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28dfs" event={"ID":"5a28cf6d-31c5-4884-a167-2725c6700e42","Type":"ContainerStarted","Data":"e62f30ea2125da99e420b69d4e62ae478c1ac3b73275d207945a154ffedad067"}
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.154355 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fqj8p" podStartSLOduration=5.910419963 podStartE2EDuration="6.154334623s" podCreationTimestamp="2026-01-30 21:29:31 +0000 UTC" firstStartedPulling="2026-01-30 21:29:35.87449065 +0000 UTC m=+909.313127401" lastFinishedPulling="2026-01-30 21:29:36.11840529 +0000 UTC m=+909.557042061" observedRunningTime="2026-01-30 21:29:37.148606485 +0000 UTC m=+910.587243256" watchObservedRunningTime="2026-01-30 21:29:37.154334623 +0000 UTC m=+910.592971394"
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.155013 4914 scope.go:117] "RemoveContainer" containerID="acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7"
Jan 30 21:29:37 crc kubenswrapper[4914]: E0130 21:29:37.156974 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7\": container with ID starting with acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7 not found: ID does not exist" containerID="acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7"
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.157045 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7"} err="failed to get container status \"acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7\": rpc error: code = NotFound desc = could not find container \"acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7\": container with ID starting with acc35472e9b152d51f11c57076679dda4104c23fda5007db13bc0f6cb96ff3f7 not found: ID does not exist"
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.202264 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mqwrr"]
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.212296 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mqwrr"]
Jan 30 21:29:37 crc kubenswrapper[4914]: I0130 21:29:37.841610 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe29cd0-120d-40b6-b9eb-48da17fd1049" path="/var/lib/kubelet/pods/5fe29cd0-120d-40b6-b9eb-48da17fd1049/volumes"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.692840 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fqj8p"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.693222 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-fqj8p"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.739381 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fqj8p"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.949661 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7bvtt"]
Jan 30 21:29:41 crc kubenswrapper[4914]: E0130 21:29:41.949914 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe29cd0-120d-40b6-b9eb-48da17fd1049" containerName="registry-server"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.949969 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe29cd0-120d-40b6-b9eb-48da17fd1049" containerName="registry-server"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.950213 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe29cd0-120d-40b6-b9eb-48da17fd1049" containerName="registry-server"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.951067 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:41 crc kubenswrapper[4914]: I0130 21:29:41.969975 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bvtt"]
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.033266 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-catalog-content\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.033309 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-utilities\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.033325 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s7jq\" (UniqueName: \"kubernetes.io/projected/b8e1b9ec-e8cb-492e-b115-c258c7237f03-kube-api-access-8s7jq\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.134852 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-catalog-content\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.134906 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-utilities\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.134934 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s7jq\" (UniqueName: \"kubernetes.io/projected/b8e1b9ec-e8cb-492e-b115-c258c7237f03-kube-api-access-8s7jq\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.135398 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-catalog-content\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.136653 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-utilities\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.160819 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s7jq\" (UniqueName: \"kubernetes.io/projected/b8e1b9ec-e8cb-492e-b115-c258c7237f03-kube-api-access-8s7jq\") pod \"redhat-marketplace-7bvtt\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.212361 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fqj8p"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.291925 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bvtt"
Jan 30 21:29:42 crc kubenswrapper[4914]: I0130 21:29:42.552956 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-srzwt"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.589931 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"]
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.592316 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.595639 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8gfrk"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.608858 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"]
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.655004 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bvtt"]
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.660526 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-util\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.660600 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v9tp\" (UniqueName: \"kubernetes.io/projected/9691502f-3160-40b6-9f3c-21a2545b14ac-kube-api-access-5v9tp\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.660728 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-bundle\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.762197 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v9tp\" (UniqueName: \"kubernetes.io/projected/9691502f-3160-40b6-9f3c-21a2545b14ac-kube-api-access-5v9tp\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.762376 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-bundle\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.762436 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-util\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.762870 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-bundle\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.763129 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-util\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.786168 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v9tp\" (UniqueName: \"kubernetes.io/projected/9691502f-3160-40b6-9f3c-21a2545b14ac-kube-api-access-5v9tp\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:43 crc kubenswrapper[4914]: I0130 21:29:43.915384 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.210280 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"]
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.235380 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd" event={"ID":"9691502f-3160-40b6-9f3c-21a2545b14ac","Type":"ContainerStarted","Data":"8c2d58a0caf1c1519adb4e676c4f9b870618a30409ff3ae5ba551828d9225886"}
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.237416 4914 generic.go:334] "Generic (PLEG): container finished" podID="5a28cf6d-31c5-4884-a167-2725c6700e42" containerID="adb8c7d38b45b9321477686b13e4c3c49c9c6a3865db77ebcf8ce2370c048b2e" exitCode=0
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.237491 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28dfs" event={"ID":"5a28cf6d-31c5-4884-a167-2725c6700e42","Type":"ContainerDied","Data":"adb8c7d38b45b9321477686b13e4c3c49c9c6a3865db77ebcf8ce2370c048b2e"}
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.240456 4914 generic.go:334] "Generic (PLEG): container finished" podID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerID="124b633724651788629f7a8a5d75603f5025474830874d3cf13292846a4ed5ab" exitCode=0
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.240510 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerDied","Data":"124b633724651788629f7a8a5d75603f5025474830874d3cf13292846a4ed5ab"}
Jan 30 21:29:44 crc kubenswrapper[4914]: I0130 21:29:44.240532 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerStarted","Data":"8d17358e2e9dcd442c305f1c457d286f4ae05636923d7c249097f26bc47b0f3e"}
Jan 30 21:29:45 crc kubenswrapper[4914]: I0130 21:29:45.249985 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28dfs" event={"ID":"5a28cf6d-31c5-4884-a167-2725c6700e42","Type":"ContainerStarted","Data":"1b593b8f183a2514ce56ac93f6dd42c8ec4f2cd3f55e71ce93556accc1466660"}
Jan 30 21:29:45 crc kubenswrapper[4914]: I0130 21:29:45.252737 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerStarted","Data":"3196d83fe3f34ef2e939c5251ffb74d7e807b7923282c2dadcdbc22469983f2a"}
Jan 30 21:29:45 crc kubenswrapper[4914]: I0130 21:29:45.254557 4914 generic.go:334] "Generic (PLEG): container finished" podID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerID="6c2b554e1b8bc318cf8a35fcf397c3eb6856a0674489f65df4826428eebf866b" exitCode=0
Jan 30 21:29:45 crc kubenswrapper[4914]: I0130 21:29:45.254593 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd" event={"ID":"9691502f-3160-40b6-9f3c-21a2545b14ac","Type":"ContainerDied","Data":"6c2b554e1b8bc318cf8a35fcf397c3eb6856a0674489f65df4826428eebf866b"}
Jan 30 21:29:45 crc kubenswrapper[4914]: I0130 21:29:45.272862 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-28dfs" podStartSLOduration=3.765264463 podStartE2EDuration="11.272843556s" podCreationTimestamp="2026-01-30 21:29:34 +0000 UTC" firstStartedPulling="2026-01-30 21:29:37.135746064 +0000 UTC m=+910.574382835" lastFinishedPulling="2026-01-30 21:29:44.643325167 +0000 UTC m=+918.081961928" observedRunningTime="2026-01-30 21:29:45.271631857 +0000 UTC m=+918.710268618" watchObservedRunningTime="2026-01-30 21:29:45.272843556 +0000 UTC m=+918.711480327"
Jan 30 21:29:46 crc kubenswrapper[4914]: I0130 21:29:46.267067 4914 generic.go:334] "Generic (PLEG): container finished" podID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerID="f8c76bb15b153b75d8f83b0cb1c2ce70433ae9be63d824a139448f8526cd489a" exitCode=0
Jan 30 21:29:46 crc kubenswrapper[4914]: I0130 21:29:46.267397 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd" event={"ID":"9691502f-3160-40b6-9f3c-21a2545b14ac","Type":"ContainerDied","Data":"f8c76bb15b153b75d8f83b0cb1c2ce70433ae9be63d824a139448f8526cd489a"}
Jan 30 21:29:46 crc kubenswrapper[4914]: I0130 21:29:46.280115 4914 generic.go:334] "Generic (PLEG): container finished" podID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerID="3196d83fe3f34ef2e939c5251ffb74d7e807b7923282c2dadcdbc22469983f2a" exitCode=0
Jan 30 21:29:46 crc kubenswrapper[4914]: I0130 21:29:46.280223 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerDied","Data":"3196d83fe3f34ef2e939c5251ffb74d7e807b7923282c2dadcdbc22469983f2a"}
Jan 30 21:29:47 crc kubenswrapper[4914]: I0130 21:29:47.291371 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerStarted","Data":"15797441f34eebcb08cfe3bb7bd2bc69f91bb5d0dd90454c10f27b03b2c4f06c"}
Jan 30 21:29:47 crc kubenswrapper[4914]: I0130 21:29:47.294776 4914 generic.go:334] "Generic (PLEG): container finished" podID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerID="19df4c533ee02ca78bcbf019ec8cc465319560364ac2b40d77fd2e7ad95b85e3" exitCode=0
Jan 30 21:29:47 crc kubenswrapper[4914]: I0130 21:29:47.294806 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd" event={"ID":"9691502f-3160-40b6-9f3c-21a2545b14ac","Type":"ContainerDied","Data":"19df4c533ee02ca78bcbf019ec8cc465319560364ac2b40d77fd2e7ad95b85e3"}
Jan 30 21:29:47 crc kubenswrapper[4914]: I0130 21:29:47.318197 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7bvtt" podStartSLOduration=3.780605592 podStartE2EDuration="6.318164032s" podCreationTimestamp="2026-01-30 21:29:41 +0000 UTC" firstStartedPulling="2026-01-30 21:29:44.241650938 +0000 UTC m=+917.680287699" lastFinishedPulling="2026-01-30 21:29:46.779209388 +0000 UTC m=+920.217846139" observedRunningTime="2026-01-30 21:29:47.313677923 +0000 UTC m=+920.752314684" watchObservedRunningTime="2026-01-30 21:29:47.318164032 +0000 UTC m=+920.756800823"
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.673924 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.745378 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v9tp\" (UniqueName: \"kubernetes.io/projected/9691502f-3160-40b6-9f3c-21a2545b14ac-kube-api-access-5v9tp\") pod \"9691502f-3160-40b6-9f3c-21a2545b14ac\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") "
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.745465 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-bundle\") pod \"9691502f-3160-40b6-9f3c-21a2545b14ac\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") "
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.745520 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-util\") pod \"9691502f-3160-40b6-9f3c-21a2545b14ac\" (UID: \"9691502f-3160-40b6-9f3c-21a2545b14ac\") "
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.746147 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-bundle" (OuterVolumeSpecName: "bundle") pod "9691502f-3160-40b6-9f3c-21a2545b14ac" (UID: "9691502f-3160-40b6-9f3c-21a2545b14ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.761436 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-util" (OuterVolumeSpecName: "util") pod "9691502f-3160-40b6-9f3c-21a2545b14ac" (UID: "9691502f-3160-40b6-9f3c-21a2545b14ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.771583 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9691502f-3160-40b6-9f3c-21a2545b14ac-kube-api-access-5v9tp" (OuterVolumeSpecName: "kube-api-access-5v9tp") pod "9691502f-3160-40b6-9f3c-21a2545b14ac" (UID: "9691502f-3160-40b6-9f3c-21a2545b14ac"). InnerVolumeSpecName "kube-api-access-5v9tp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.847466 4914 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-util\") on node \"crc\" DevicePath \"\""
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.847503 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v9tp\" (UniqueName: \"kubernetes.io/projected/9691502f-3160-40b6-9f3c-21a2545b14ac-kube-api-access-5v9tp\") on node \"crc\" DevicePath \"\""
Jan 30 21:29:48 crc kubenswrapper[4914]: I0130 21:29:48.847514 4914 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9691502f-3160-40b6-9f3c-21a2545b14ac-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:29:49 crc kubenswrapper[4914]: I0130 21:29:49.310323 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd" event={"ID":"9691502f-3160-40b6-9f3c-21a2545b14ac","Type":"ContainerDied","Data":"8c2d58a0caf1c1519adb4e676c4f9b870618a30409ff3ae5ba551828d9225886"}
Jan 30 21:29:49 crc kubenswrapper[4914]: I0130 21:29:49.310364 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c2d58a0caf1c1519adb4e676c4f9b870618a30409ff3ae5ba551828d9225886"
Jan 30 21:29:49 crc kubenswrapper[4914]: I0130 21:29:49.310426 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.860189 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"]
Jan 30 21:29:50 crc kubenswrapper[4914]: E0130 21:29:50.860756 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="pull"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.860771 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="pull"
Jan 30 21:29:50 crc kubenswrapper[4914]: E0130 21:29:50.860780 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="extract"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.860788 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="extract"
Jan 30 21:29:50 crc kubenswrapper[4914]: E0130 21:29:50.860803 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="util"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.860810 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="util"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.860955 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9691502f-3160-40b6-9f3c-21a2545b14ac" containerName="extract"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.861444 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.866103 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pssld"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.873398 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd2mg\" (UniqueName: \"kubernetes.io/projected/d17e19a0-3c34-44d5-8684-f78694dcb2ce-kube-api-access-bd2mg\") pod \"openstack-operator-controller-init-55fdcd6c79-z5tf8\" (UID: \"d17e19a0-3c34-44d5-8684-f78694dcb2ce\") " pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.887696 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"]
Jan 30 21:29:50 crc kubenswrapper[4914]: I0130 21:29:50.978835 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd2mg\" (UniqueName: \"kubernetes.io/projected/d17e19a0-3c34-44d5-8684-f78694dcb2ce-kube-api-access-bd2mg\") pod \"openstack-operator-controller-init-55fdcd6c79-z5tf8\" (UID: \"d17e19a0-3c34-44d5-8684-f78694dcb2ce\") " pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"
Jan 30 21:29:51 crc kubenswrapper[4914]: I0130 21:29:51.000291 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd2mg\" (UniqueName: \"kubernetes.io/projected/d17e19a0-3c34-44d5-8684-f78694dcb2ce-kube-api-access-bd2mg\") pod \"openstack-operator-controller-init-55fdcd6c79-z5tf8\" (UID: \"d17e19a0-3c34-44d5-8684-f78694dcb2ce\") " pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"
Jan 30 21:29:51 crc kubenswrapper[4914]: I0130 21:29:51.180168 4914 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8" Jan 30 21:29:51 crc kubenswrapper[4914]: I0130 21:29:51.688095 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8"] Jan 30 21:29:51 crc kubenswrapper[4914]: I0130 21:29:51.693143 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:29:52 crc kubenswrapper[4914]: I0130 21:29:52.292743 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7bvtt" Jan 30 21:29:52 crc kubenswrapper[4914]: I0130 21:29:52.293027 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7bvtt" Jan 30 21:29:52 crc kubenswrapper[4914]: I0130 21:29:52.377196 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7bvtt" Jan 30 21:29:52 crc kubenswrapper[4914]: I0130 21:29:52.382508 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8" event={"ID":"d17e19a0-3c34-44d5-8684-f78694dcb2ce","Type":"ContainerStarted","Data":"ed1096f20abf54a3277ce23b75a6fc47a800a81511df293cad07921ed618aa23"} Jan 30 21:29:52 crc kubenswrapper[4914]: I0130 21:29:52.421950 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7bvtt" Jan 30 21:29:54 crc kubenswrapper[4914]: I0130 21:29:54.538558 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bvtt"] Jan 30 21:29:54 crc kubenswrapper[4914]: I0130 21:29:54.539347 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7bvtt" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" 
containerName="registry-server" containerID="cri-o://15797441f34eebcb08cfe3bb7bd2bc69f91bb5d0dd90454c10f27b03b2c4f06c" gracePeriod=2 Jan 30 21:29:55 crc kubenswrapper[4914]: I0130 21:29:55.077358 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-28dfs" Jan 30 21:29:55 crc kubenswrapper[4914]: I0130 21:29:55.077744 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-28dfs" Jan 30 21:29:55 crc kubenswrapper[4914]: I0130 21:29:55.136337 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-28dfs" Jan 30 21:29:55 crc kubenswrapper[4914]: I0130 21:29:55.462956 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-28dfs" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.437957 4914 generic.go:334] "Generic (PLEG): container finished" podID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerID="15797441f34eebcb08cfe3bb7bd2bc69f91bb5d0dd90454c10f27b03b2c4f06c" exitCode=0 Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.438111 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerDied","Data":"15797441f34eebcb08cfe3bb7bd2bc69f91bb5d0dd90454c10f27b03b2c4f06c"} Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.731904 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bvtt" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.794815 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s7jq\" (UniqueName: \"kubernetes.io/projected/b8e1b9ec-e8cb-492e-b115-c258c7237f03-kube-api-access-8s7jq\") pod \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.794927 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-utilities\") pod \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.794983 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-catalog-content\") pod \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\" (UID: \"b8e1b9ec-e8cb-492e-b115-c258c7237f03\") " Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.796319 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-utilities" (OuterVolumeSpecName: "utilities") pod "b8e1b9ec-e8cb-492e-b115-c258c7237f03" (UID: "b8e1b9ec-e8cb-492e-b115-c258c7237f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.800665 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e1b9ec-e8cb-492e-b115-c258c7237f03-kube-api-access-8s7jq" (OuterVolumeSpecName: "kube-api-access-8s7jq") pod "b8e1b9ec-e8cb-492e-b115-c258c7237f03" (UID: "b8e1b9ec-e8cb-492e-b115-c258c7237f03"). InnerVolumeSpecName "kube-api-access-8s7jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.816891 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8e1b9ec-e8cb-492e-b115-c258c7237f03" (UID: "b8e1b9ec-e8cb-492e-b115-c258c7237f03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.896671 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.896702 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e1b9ec-e8cb-492e-b115-c258c7237f03-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:57 crc kubenswrapper[4914]: I0130 21:29:57.896729 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s7jq\" (UniqueName: \"kubernetes.io/projected/b8e1b9ec-e8cb-492e-b115-c258c7237f03-kube-api-access-8s7jq\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.162078 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28dfs"] Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.448874 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8" event={"ID":"d17e19a0-3c34-44d5-8684-f78694dcb2ce","Type":"ContainerStarted","Data":"78ec6fab52930791040bed12693f6259f8fc11adba6e4c619bebfcbf3025d58f"} Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.449023 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.451461 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bvtt" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.451537 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bvtt" event={"ID":"b8e1b9ec-e8cb-492e-b115-c258c7237f03","Type":"ContainerDied","Data":"8d17358e2e9dcd442c305f1c457d286f4ae05636923d7c249097f26bc47b0f3e"} Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.451588 4914 scope.go:117] "RemoveContainer" containerID="15797441f34eebcb08cfe3bb7bd2bc69f91bb5d0dd90454c10f27b03b2c4f06c" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.482568 4914 scope.go:117] "RemoveContainer" containerID="3196d83fe3f34ef2e939c5251ffb74d7e807b7923282c2dadcdbc22469983f2a" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.493610 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8" podStartSLOduration=2.205218122 podStartE2EDuration="8.493578966s" podCreationTimestamp="2026-01-30 21:29:50 +0000 UTC" firstStartedPulling="2026-01-30 21:29:51.692934223 +0000 UTC m=+925.131570984" lastFinishedPulling="2026-01-30 21:29:57.981295067 +0000 UTC m=+931.419931828" observedRunningTime="2026-01-30 21:29:58.485686555 +0000 UTC m=+931.924323316" watchObservedRunningTime="2026-01-30 21:29:58.493578966 +0000 UTC m=+931.932215757" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.509463 4914 scope.go:117] "RemoveContainer" containerID="124b633724651788629f7a8a5d75603f5025474830874d3cf13292846a4ed5ab" Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.512440 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bvtt"] Jan 30 21:29:58 crc 
kubenswrapper[4914]: I0130 21:29:58.519177 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bvtt"] Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.739222 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-485b5"] Jan 30 21:29:58 crc kubenswrapper[4914]: I0130 21:29:58.739486 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-485b5" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="registry-server" containerID="cri-o://287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef" gracePeriod=2 Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.134901 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.216701 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v87g\" (UniqueName: \"kubernetes.io/projected/2847fa80-29b0-4b80-b48b-04661f64dbc7-kube-api-access-6v87g\") pod \"2847fa80-29b0-4b80-b48b-04661f64dbc7\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.216802 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-catalog-content\") pod \"2847fa80-29b0-4b80-b48b-04661f64dbc7\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.216899 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-utilities\") pod \"2847fa80-29b0-4b80-b48b-04661f64dbc7\" (UID: \"2847fa80-29b0-4b80-b48b-04661f64dbc7\") " Jan 30 21:29:59 crc 
kubenswrapper[4914]: I0130 21:29:59.217851 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-utilities" (OuterVolumeSpecName: "utilities") pod "2847fa80-29b0-4b80-b48b-04661f64dbc7" (UID: "2847fa80-29b0-4b80-b48b-04661f64dbc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.239682 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2847fa80-29b0-4b80-b48b-04661f64dbc7-kube-api-access-6v87g" (OuterVolumeSpecName: "kube-api-access-6v87g") pod "2847fa80-29b0-4b80-b48b-04661f64dbc7" (UID: "2847fa80-29b0-4b80-b48b-04661f64dbc7"). InnerVolumeSpecName "kube-api-access-6v87g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.285102 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2847fa80-29b0-4b80-b48b-04661f64dbc7" (UID: "2847fa80-29b0-4b80-b48b-04661f64dbc7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.318808 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.318839 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v87g\" (UniqueName: \"kubernetes.io/projected/2847fa80-29b0-4b80-b48b-04661f64dbc7-kube-api-access-6v87g\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.318852 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2847fa80-29b0-4b80-b48b-04661f64dbc7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.458576 4914 generic.go:334] "Generic (PLEG): container finished" podID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerID="287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef" exitCode=0 Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.458618 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-485b5" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.458668 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-485b5" event={"ID":"2847fa80-29b0-4b80-b48b-04661f64dbc7","Type":"ContainerDied","Data":"287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef"} Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.458744 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-485b5" event={"ID":"2847fa80-29b0-4b80-b48b-04661f64dbc7","Type":"ContainerDied","Data":"c7d6ac4cccdbe19d7e1a9a01cba51e387baf1110ea9fab09c749500967c47b5f"} Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.458776 4914 scope.go:117] "RemoveContainer" containerID="287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.474992 4914 scope.go:117] "RemoveContainer" containerID="57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.495646 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-485b5"] Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.502672 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-485b5"] Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.509317 4914 scope.go:117] "RemoveContainer" containerID="2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.527928 4914 scope.go:117] "RemoveContainer" containerID="287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef" Jan 30 21:29:59 crc kubenswrapper[4914]: E0130 21:29:59.528350 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef\": container with ID starting with 287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef not found: ID does not exist" containerID="287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.528401 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef"} err="failed to get container status \"287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef\": rpc error: code = NotFound desc = could not find container \"287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef\": container with ID starting with 287ab6affd14e6654f724d7ea564ec01d16cd7db2f286e32f58ef0497255a7ef not found: ID does not exist" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.528432 4914 scope.go:117] "RemoveContainer" containerID="57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e" Jan 30 21:29:59 crc kubenswrapper[4914]: E0130 21:29:59.528821 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e\": container with ID starting with 57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e not found: ID does not exist" containerID="57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.528898 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e"} err="failed to get container status \"57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e\": rpc error: code = NotFound desc = could not find container \"57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e\": container with ID 
starting with 57ebeab9b8140202118a8e6d681682890ff4b16828a9806a5fa2c9ba67ff600e not found: ID does not exist" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.528916 4914 scope.go:117] "RemoveContainer" containerID="2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be" Jan 30 21:29:59 crc kubenswrapper[4914]: E0130 21:29:59.529234 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be\": container with ID starting with 2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be not found: ID does not exist" containerID="2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.529285 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be"} err="failed to get container status \"2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be\": rpc error: code = NotFound desc = could not find container \"2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be\": container with ID starting with 2e9ea8350f7010d9470711da9ffdc433f25945817b6937ac593dcf7b5ccaa1be not found: ID does not exist" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.826038 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" path="/var/lib/kubelet/pods/2847fa80-29b0-4b80-b48b-04661f64dbc7/volumes" Jan 30 21:29:59 crc kubenswrapper[4914]: I0130 21:29:59.826679 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" path="/var/lib/kubelet/pods/b8e1b9ec-e8cb-492e-b115-c258c7237f03/volumes" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.237231 4914 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz"] Jan 30 21:30:00 crc kubenswrapper[4914]: E0130 21:30:00.237876 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.237900 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4914]: E0130 21:30:00.237920 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="extract-utilities" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.237930 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="extract-utilities" Jan 30 21:30:00 crc kubenswrapper[4914]: E0130 21:30:00.237944 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="extract-content" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.237953 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="extract-content" Jan 30 21:30:00 crc kubenswrapper[4914]: E0130 21:30:00.237971 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="extract-utilities" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.237978 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="extract-utilities" Jan 30 21:30:00 crc kubenswrapper[4914]: E0130 21:30:00.237999 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.238007 4914 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4914]: E0130 21:30:00.238021 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="extract-content" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.238028 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="extract-content" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.238322 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2847fa80-29b0-4b80-b48b-04661f64dbc7" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.238336 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e1b9ec-e8cb-492e-b115-c258c7237f03" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.239287 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.245917 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.245998 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.259991 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz"] Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.332482 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11441868-74c0-4ddc-af04-146296bfe8ed-config-volume\") pod 
\"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.332926 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11441868-74c0-4ddc-af04-146296bfe8ed-secret-volume\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.333025 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwbh\" (UniqueName: \"kubernetes.io/projected/11441868-74c0-4ddc-af04-146296bfe8ed-kube-api-access-5pwbh\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.434633 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11441868-74c0-4ddc-af04-146296bfe8ed-config-volume\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.434763 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11441868-74c0-4ddc-af04-146296bfe8ed-secret-volume\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.434812 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5pwbh\" (UniqueName: \"kubernetes.io/projected/11441868-74c0-4ddc-af04-146296bfe8ed-kube-api-access-5pwbh\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.435643 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11441868-74c0-4ddc-af04-146296bfe8ed-config-volume\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.441429 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11441868-74c0-4ddc-af04-146296bfe8ed-secret-volume\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.452352 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwbh\" (UniqueName: \"kubernetes.io/projected/11441868-74c0-4ddc-af04-146296bfe8ed-kube-api-access-5pwbh\") pod \"collect-profiles-29496810-22trz\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.572153 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:00 crc kubenswrapper[4914]: I0130 21:30:00.868633 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz"] Jan 30 21:30:01 crc kubenswrapper[4914]: I0130 21:30:01.474521 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" event={"ID":"11441868-74c0-4ddc-af04-146296bfe8ed","Type":"ContainerStarted","Data":"0e4d8f8a5222000df7457f72c6cf2b7e1f81016fe6926304d5b1855f83531346"} Jan 30 21:30:02 crc kubenswrapper[4914]: I0130 21:30:02.483613 4914 generic.go:334] "Generic (PLEG): container finished" podID="11441868-74c0-4ddc-af04-146296bfe8ed" containerID="cef95861187d59ad6c1dfbf677e944d468dd9847157e32103631e77ed82f581e" exitCode=0 Jan 30 21:30:02 crc kubenswrapper[4914]: I0130 21:30:02.483878 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" event={"ID":"11441868-74c0-4ddc-af04-146296bfe8ed","Type":"ContainerDied","Data":"cef95861187d59ad6c1dfbf677e944d468dd9847157e32103631e77ed82f581e"} Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.827052 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.876759 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11441868-74c0-4ddc-af04-146296bfe8ed-secret-volume\") pod \"11441868-74c0-4ddc-af04-146296bfe8ed\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.876796 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11441868-74c0-4ddc-af04-146296bfe8ed-config-volume\") pod \"11441868-74c0-4ddc-af04-146296bfe8ed\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.876858 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pwbh\" (UniqueName: \"kubernetes.io/projected/11441868-74c0-4ddc-af04-146296bfe8ed-kube-api-access-5pwbh\") pod \"11441868-74c0-4ddc-af04-146296bfe8ed\" (UID: \"11441868-74c0-4ddc-af04-146296bfe8ed\") " Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.877633 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11441868-74c0-4ddc-af04-146296bfe8ed-config-volume" (OuterVolumeSpecName: "config-volume") pod "11441868-74c0-4ddc-af04-146296bfe8ed" (UID: "11441868-74c0-4ddc-af04-146296bfe8ed"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.886604 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11441868-74c0-4ddc-af04-146296bfe8ed-kube-api-access-5pwbh" (OuterVolumeSpecName: "kube-api-access-5pwbh") pod "11441868-74c0-4ddc-af04-146296bfe8ed" (UID: "11441868-74c0-4ddc-af04-146296bfe8ed"). 
InnerVolumeSpecName "kube-api-access-5pwbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.897676 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11441868-74c0-4ddc-af04-146296bfe8ed-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11441868-74c0-4ddc-af04-146296bfe8ed" (UID: "11441868-74c0-4ddc-af04-146296bfe8ed"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.978800 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pwbh\" (UniqueName: \"kubernetes.io/projected/11441868-74c0-4ddc-af04-146296bfe8ed-kube-api-access-5pwbh\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.978833 4914 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11441868-74c0-4ddc-af04-146296bfe8ed-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:03 crc kubenswrapper[4914]: I0130 21:30:03.978846 4914 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11441868-74c0-4ddc-af04-146296bfe8ed-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:04 crc kubenswrapper[4914]: I0130 21:30:04.501949 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" event={"ID":"11441868-74c0-4ddc-af04-146296bfe8ed","Type":"ContainerDied","Data":"0e4d8f8a5222000df7457f72c6cf2b7e1f81016fe6926304d5b1855f83531346"} Jan 30 21:30:04 crc kubenswrapper[4914]: I0130 21:30:04.502011 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e4d8f8a5222000df7457f72c6cf2b7e1f81016fe6926304d5b1855f83531346" Jan 30 21:30:04 crc kubenswrapper[4914]: I0130 21:30:04.502019 4914 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz" Jan 30 21:30:11 crc kubenswrapper[4914]: I0130 21:30:11.185173 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-z5tf8" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.284806 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hvtw"] Jan 30 21:30:26 crc kubenswrapper[4914]: E0130 21:30:26.286343 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11441868-74c0-4ddc-af04-146296bfe8ed" containerName="collect-profiles" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.286374 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="11441868-74c0-4ddc-af04-146296bfe8ed" containerName="collect-profiles" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.286695 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="11441868-74c0-4ddc-af04-146296bfe8ed" containerName="collect-profiles" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.288867 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.291247 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hvtw"] Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.398987 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd8e563-19a1-460d-8a83-1b0d22d6212d-catalog-content\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.399390 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmzj9\" (UniqueName: \"kubernetes.io/projected/5cd8e563-19a1-460d-8a83-1b0d22d6212d-kube-api-access-bmzj9\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.399422 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd8e563-19a1-460d-8a83-1b0d22d6212d-utilities\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.501199 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd8e563-19a1-460d-8a83-1b0d22d6212d-catalog-content\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.501297 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bmzj9\" (UniqueName: \"kubernetes.io/projected/5cd8e563-19a1-460d-8a83-1b0d22d6212d-kube-api-access-bmzj9\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.501321 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd8e563-19a1-460d-8a83-1b0d22d6212d-utilities\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.501930 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd8e563-19a1-460d-8a83-1b0d22d6212d-catalog-content\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.501974 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd8e563-19a1-460d-8a83-1b0d22d6212d-utilities\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.535247 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmzj9\" (UniqueName: \"kubernetes.io/projected/5cd8e563-19a1-460d-8a83-1b0d22d6212d-kube-api-access-bmzj9\") pod \"certified-operators-4hvtw\" (UID: \"5cd8e563-19a1-460d-8a83-1b0d22d6212d\") " pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.610150 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:26 crc kubenswrapper[4914]: I0130 21:30:26.962022 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hvtw"] Jan 30 21:30:26 crc kubenswrapper[4914]: W0130 21:30:26.970848 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd8e563_19a1_460d_8a83_1b0d22d6212d.slice/crio-3c68b070415f0814e733121d2b470db2fdb242f357882ebd0e71310ef453e455 WatchSource:0}: Error finding container 3c68b070415f0814e733121d2b470db2fdb242f357882ebd0e71310ef453e455: Status 404 returned error can't find the container with id 3c68b070415f0814e733121d2b470db2fdb242f357882ebd0e71310ef453e455 Jan 30 21:30:27 crc kubenswrapper[4914]: I0130 21:30:27.714043 4914 generic.go:334] "Generic (PLEG): container finished" podID="5cd8e563-19a1-460d-8a83-1b0d22d6212d" containerID="6ba09997329c486849e76b059d0b120fbf924e351ce88dad78856c5fe3417ef6" exitCode=0 Jan 30 21:30:27 crc kubenswrapper[4914]: I0130 21:30:27.714186 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hvtw" event={"ID":"5cd8e563-19a1-460d-8a83-1b0d22d6212d","Type":"ContainerDied","Data":"6ba09997329c486849e76b059d0b120fbf924e351ce88dad78856c5fe3417ef6"} Jan 30 21:30:27 crc kubenswrapper[4914]: I0130 21:30:27.714404 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hvtw" event={"ID":"5cd8e563-19a1-460d-8a83-1b0d22d6212d","Type":"ContainerStarted","Data":"3c68b070415f0814e733121d2b470db2fdb242f357882ebd0e71310ef453e455"} Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.692114 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.693590 4914 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.699254 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-pmb64" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.712469 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.713483 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.717346 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wflq2" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.728770 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.733090 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.734148 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.736816 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-42vft" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.745232 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.780797 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.781867 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.787154 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xlvwq" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.789656 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rz7g\" (UniqueName: \"kubernetes.io/projected/7fa005ac-0d9e-4784-8558-df96b2d54006-kube-api-access-6rz7g\") pod \"designate-operator-controller-manager-6d9697b7f4-54hzc\" (UID: \"7fa005ac-0d9e-4784-8558-df96b2d54006\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.802772 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.838355 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7"] 
Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.839292 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.840962 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.849273 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kvrcb" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.873567 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.876346 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.887531 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t2hb2" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.890607 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rz7g\" (UniqueName: \"kubernetes.io/projected/7fa005ac-0d9e-4784-8558-df96b2d54006-kube-api-access-6rz7g\") pod \"designate-operator-controller-manager-6d9697b7f4-54hzc\" (UID: \"7fa005ac-0d9e-4784-8558-df96b2d54006\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.890785 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldzrd\" (UniqueName: \"kubernetes.io/projected/8c596723-7c41-448d-831e-07fa9d1129e9-kube-api-access-ldzrd\") pod 
\"barbican-operator-controller-manager-7b6c4d8c5f-9jf6b\" (UID: \"8c596723-7c41-448d-831e-07fa9d1129e9\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.890873 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw74l\" (UniqueName: \"kubernetes.io/projected/9a7f7899-f35e-4fef-ba51-82af970498db-kube-api-access-sw74l\") pod \"cinder-operator-controller-manager-8d874c8fc-t225p\" (UID: \"9a7f7899-f35e-4fef-ba51-82af970498db\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.890960 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xnst\" (UniqueName: \"kubernetes.io/projected/3330ea8a-466e-4ba5-ad5e-dcb7859521b0-kube-api-access-5xnst\") pod \"heat-operator-controller-manager-69d6db494d-qz2j7\" (UID: \"3330ea8a-466e-4ba5-ad5e-dcb7859521b0\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.891041 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m579p\" (UniqueName: \"kubernetes.io/projected/1746db5a-3b9a-4d76-b1f3-845b907ccabc-kube-api-access-m579p\") pod \"horizon-operator-controller-manager-5fb775575f-vvdgj\" (UID: \"1746db5a-3b9a-4d76-b1f3-845b907ccabc\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.891120 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vmd\" (UniqueName: \"kubernetes.io/projected/c6d2cebc-7c79-407e-8f69-6b93ab2b41b7-kube-api-access-45vmd\") pod \"glance-operator-controller-manager-8886f4c47-hrq78\" (UID: 
\"c6d2cebc-7c79-407e-8f69-6b93ab2b41b7\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.894026 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.919914 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rz7g\" (UniqueName: \"kubernetes.io/projected/7fa005ac-0d9e-4784-8558-df96b2d54006-kube-api-access-6rz7g\") pod \"designate-operator-controller-manager-6d9697b7f4-54hzc\" (UID: \"7fa005ac-0d9e-4784-8558-df96b2d54006\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.931119 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.953851 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.954891 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.960602 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.960791 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cvv5j" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.971769 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.972784 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.975940 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-lwj98" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.983625 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.985059 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.990420 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-79djh" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.990654 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"] Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.991885 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xnst\" (UniqueName: \"kubernetes.io/projected/3330ea8a-466e-4ba5-ad5e-dcb7859521b0-kube-api-access-5xnst\") pod \"heat-operator-controller-manager-69d6db494d-qz2j7\" (UID: \"3330ea8a-466e-4ba5-ad5e-dcb7859521b0\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.991934 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m579p\" (UniqueName: \"kubernetes.io/projected/1746db5a-3b9a-4d76-b1f3-845b907ccabc-kube-api-access-m579p\") pod \"horizon-operator-controller-manager-5fb775575f-vvdgj\" (UID: \"1746db5a-3b9a-4d76-b1f3-845b907ccabc\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.991965 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8kbp\" (UniqueName: \"kubernetes.io/projected/f96847c8-b695-44a5-8756-b2fe0da5e409-kube-api-access-j8kbp\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.991986 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-45vmd\" (UniqueName: \"kubernetes.io/projected/c6d2cebc-7c79-407e-8f69-6b93ab2b41b7-kube-api-access-45vmd\") pod \"glance-operator-controller-manager-8886f4c47-hrq78\" (UID: \"c6d2cebc-7c79-407e-8f69-6b93ab2b41b7\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.992026 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fdw\" (UniqueName: \"kubernetes.io/projected/8419dc35-995b-43a3-82b7-6c2b7eb66d35-kube-api-access-r2fdw\") pod \"keystone-operator-controller-manager-84f48565d4-qc9x5\" (UID: \"8419dc35-995b-43a3-82b7-6c2b7eb66d35\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.992048 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.992085 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lj8z\" (UniqueName: \"kubernetes.io/projected/91003d71-490f-458c-94e9-d8957d6eaac9-kube-api-access-4lj8z\") pod \"ironic-operator-controller-manager-5f4b8bd54d-bmgvt\" (UID: \"91003d71-490f-458c-94e9-d8957d6eaac9\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.992102 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldzrd\" (UniqueName: 
\"kubernetes.io/projected/8c596723-7c41-448d-831e-07fa9d1129e9-kube-api-access-ldzrd\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-9jf6b\" (UID: \"8c596723-7c41-448d-831e-07fa9d1129e9\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" Jan 30 21:30:31 crc kubenswrapper[4914]: I0130 21:30:31.992119 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw74l\" (UniqueName: \"kubernetes.io/projected/9a7f7899-f35e-4fef-ba51-82af970498db-kube-api-access-sw74l\") pod \"cinder-operator-controller-manager-8d874c8fc-t225p\" (UID: \"9a7f7899-f35e-4fef-ba51-82af970498db\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.001336 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.011275 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m579p\" (UniqueName: \"kubernetes.io/projected/1746db5a-3b9a-4d76-b1f3-845b907ccabc-kube-api-access-m579p\") pod \"horizon-operator-controller-manager-5fb775575f-vvdgj\" (UID: \"1746db5a-3b9a-4d76-b1f3-845b907ccabc\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.011662 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.012150 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw74l\" (UniqueName: \"kubernetes.io/projected/9a7f7899-f35e-4fef-ba51-82af970498db-kube-api-access-sw74l\") pod \"cinder-operator-controller-manager-8d874c8fc-t225p\" (UID: \"9a7f7899-f35e-4fef-ba51-82af970498db\") " 
pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.014590 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.015354 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vmd\" (UniqueName: \"kubernetes.io/projected/c6d2cebc-7c79-407e-8f69-6b93ab2b41b7-kube-api-access-45vmd\") pod \"glance-operator-controller-manager-8886f4c47-hrq78\" (UID: \"c6d2cebc-7c79-407e-8f69-6b93ab2b41b7\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.017171 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldzrd\" (UniqueName: \"kubernetes.io/projected/8c596723-7c41-448d-831e-07fa9d1129e9-kube-api-access-ldzrd\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-9jf6b\" (UID: \"8c596723-7c41-448d-831e-07fa9d1129e9\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.019144 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xnst\" (UniqueName: \"kubernetes.io/projected/3330ea8a-466e-4ba5-ad5e-dcb7859521b0-kube-api-access-5xnst\") pod \"heat-operator-controller-manager-69d6db494d-qz2j7\" (UID: \"3330ea8a-466e-4ba5-ad5e-dcb7859521b0\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.019375 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.019452 4914 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7tt28" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.019631 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.025815 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.030656 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.031442 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.034917 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gfvd5" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.043350 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.044245 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.045624 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-f2qbd" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.051347 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.060788 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.063557 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"] Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.071914 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.094607 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fdw\" (UniqueName: \"kubernetes.io/projected/8419dc35-995b-43a3-82b7-6c2b7eb66d35-kube-api-access-r2fdw\") pod \"keystone-operator-controller-manager-84f48565d4-qc9x5\" (UID: \"8419dc35-995b-43a3-82b7-6c2b7eb66d35\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.094674 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.094749 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lj8z\" (UniqueName: \"kubernetes.io/projected/91003d71-490f-458c-94e9-d8957d6eaac9-kube-api-access-4lj8z\") pod \"ironic-operator-controller-manager-5f4b8bd54d-bmgvt\" (UID: \"91003d71-490f-458c-94e9-d8957d6eaac9\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.094786 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8kbp\" (UniqueName: \"kubernetes.io/projected/f96847c8-b695-44a5-8756-b2fe0da5e409-kube-api-access-j8kbp\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"
Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.095054 4914 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.095101 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert podName:f96847c8-b695-44a5-8756-b2fe0da5e409 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:32.595085985 +0000 UTC m=+966.033722746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert") pod "infra-operator-controller-manager-79955696d6-52vwl" (UID: "f96847c8-b695-44a5-8756-b2fe0da5e409") : secret "infra-operator-webhook-server-cert" not found
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.106683 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.114279 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.147021 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fdw\" (UniqueName: \"kubernetes.io/projected/8419dc35-995b-43a3-82b7-6c2b7eb66d35-kube-api-access-r2fdw\") pod \"keystone-operator-controller-manager-84f48565d4-qc9x5\" (UID: \"8419dc35-995b-43a3-82b7-6c2b7eb66d35\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.150326 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8kbp\" (UniqueName: \"kubernetes.io/projected/f96847c8-b695-44a5-8756-b2fe0da5e409-kube-api-access-j8kbp\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.154810 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.157859 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.164325 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-89rr2"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.164662 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lj8z\" (UniqueName: \"kubernetes.io/projected/91003d71-490f-458c-94e9-d8957d6eaac9-kube-api-access-4lj8z\") pod \"ironic-operator-controller-manager-5f4b8bd54d-bmgvt\" (UID: \"91003d71-490f-458c-94e9-d8957d6eaac9\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.189172 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.189190 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.189333 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.189204 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.191447 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zdd9b"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.196454 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j98n\" (UniqueName: \"kubernetes.io/projected/d505f587-6893-495d-99a0-acbdee4442df-kube-api-access-6j98n\") pod \"neutron-operator-controller-manager-585dbc889-wg7x6\" (UID: \"d505f587-6893-495d-99a0-acbdee4442df\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.196547 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbt9n\" (UniqueName: \"kubernetes.io/projected/0f009ea3-601c-4c2c-bbf4-d300abfe1100-kube-api-access-hbt9n\") pod \"manila-operator-controller-manager-7dd968899f-648g4\" (UID: \"0f009ea3-601c-4c2c-bbf4-d300abfe1100\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.196618 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrn9\" (UniqueName: \"kubernetes.io/projected/aa6ad36a-4244-490c-970b-52b03b3c3821-kube-api-access-qgrn9\") pod \"mariadb-operator-controller-manager-67bf948998-kjw6v\" (UID: \"aa6ad36a-4244-490c-970b-52b03b3c3821\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.227015 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.250909 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.252035 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.262821 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-f6dbz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.274040 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.275155 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.278073 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-284nh"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.278287 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.294832 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.312247 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98lkt\" (UniqueName: \"kubernetes.io/projected/71273dcf-3d97-4a71-9b09-d8261da90f73-kube-api-access-98lkt\") pod \"octavia-operator-controller-manager-6687f8d877-lxv78\" (UID: \"71273dcf-3d97-4a71-9b09-d8261da90f73\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.321468 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrn9\" (UniqueName: \"kubernetes.io/projected/aa6ad36a-4244-490c-970b-52b03b3c3821-kube-api-access-qgrn9\") pod \"mariadb-operator-controller-manager-67bf948998-kjw6v\" (UID: \"aa6ad36a-4244-490c-970b-52b03b3c3821\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.321888 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j98n\" (UniqueName: \"kubernetes.io/projected/d505f587-6893-495d-99a0-acbdee4442df-kube-api-access-6j98n\") pod \"neutron-operator-controller-manager-585dbc889-wg7x6\" (UID: \"d505f587-6893-495d-99a0-acbdee4442df\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.322003 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcf5v\" (UniqueName: \"kubernetes.io/projected/54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27-kube-api-access-lcf5v\") pod \"nova-operator-controller-manager-55bff696bd-td9qq\" (UID: \"54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.322097 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbt9n\" (UniqueName: \"kubernetes.io/projected/0f009ea3-601c-4c2c-bbf4-d300abfe1100-kube-api-access-hbt9n\") pod \"manila-operator-controller-manager-7dd968899f-648g4\" (UID: \"0f009ea3-601c-4c2c-bbf4-d300abfe1100\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.320347 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.343731 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.344899 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.351513 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j98n\" (UniqueName: \"kubernetes.io/projected/d505f587-6893-495d-99a0-acbdee4442df-kube-api-access-6j98n\") pod \"neutron-operator-controller-manager-585dbc889-wg7x6\" (UID: \"d505f587-6893-495d-99a0-acbdee4442df\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.352190 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ltl2c"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.357245 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbt9n\" (UniqueName: \"kubernetes.io/projected/0f009ea3-601c-4c2c-bbf4-d300abfe1100-kube-api-access-hbt9n\") pod \"manila-operator-controller-manager-7dd968899f-648g4\" (UID: \"0f009ea3-601c-4c2c-bbf4-d300abfe1100\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.357364 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrn9\" (UniqueName: \"kubernetes.io/projected/aa6ad36a-4244-490c-970b-52b03b3c3821-kube-api-access-qgrn9\") pod \"mariadb-operator-controller-manager-67bf948998-kjw6v\" (UID: \"aa6ad36a-4244-490c-970b-52b03b3c3821\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.359679 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.360812 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.372184 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.379061 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.386676 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.390246 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.392535 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.402026 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lw8xm"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.402505 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.406044 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.424929 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmdkd\" (UniqueName: \"kubernetes.io/projected/f23972f8-fa47-444a-b26a-f02086d4f186-kube-api-access-kmdkd\") pod \"placement-operator-controller-manager-5b964cf4cd-wknxv\" (UID: \"f23972f8-fa47-444a-b26a-f02086d4f186\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.425262 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98lkt\" (UniqueName: \"kubernetes.io/projected/71273dcf-3d97-4a71-9b09-d8261da90f73-kube-api-access-98lkt\") pod \"octavia-operator-controller-manager-6687f8d877-lxv78\" (UID: \"71273dcf-3d97-4a71-9b09-d8261da90f73\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.425318 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.425374 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbfbm\" (UniqueName: \"kubernetes.io/projected/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-kube-api-access-qbfbm\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.425421 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krnkl\" (UniqueName: \"kubernetes.io/projected/ccefb276-816b-481e-a645-2bc8b4619d7c-kube-api-access-krnkl\") pod \"swift-operator-controller-manager-68fc8c869-gltkz\" (UID: \"ccefb276-816b-481e-a645-2bc8b4619d7c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.425480 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv78h\" (UniqueName: \"kubernetes.io/projected/d9414032-0155-4d4b-b456-02ee0f3f4185-kube-api-access-qv78h\") pod \"ovn-operator-controller-manager-788c46999f-t8sfg\" (UID: \"d9414032-0155-4d4b-b456-02ee0f3f4185\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.425531 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcf5v\" (UniqueName: \"kubernetes.io/projected/54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27-kube-api-access-lcf5v\") pod \"nova-operator-controller-manager-55bff696bd-td9qq\" (UID: \"54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.435483 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.437929 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.440046 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-8sxkt"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.446196 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.449444 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcf5v\" (UniqueName: \"kubernetes.io/projected/54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27-kube-api-access-lcf5v\") pod \"nova-operator-controller-manager-55bff696bd-td9qq\" (UID: \"54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.453288 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98lkt\" (UniqueName: \"kubernetes.io/projected/71273dcf-3d97-4a71-9b09-d8261da90f73-kube-api-access-98lkt\") pod \"octavia-operator-controller-manager-6687f8d877-lxv78\" (UID: \"71273dcf-3d97-4a71-9b09-d8261da90f73\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.468757 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.469608 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.473005 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sjjqz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.477634 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.488486 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.488595 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-th2ws"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.489431 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.492312 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9tfdf"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.498436 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-th2ws"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.525720 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.526517 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529039 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzp5\" (UniqueName: \"kubernetes.io/projected/b8e1305a-5b6a-45f3-a228-b16259431de5-kube-api-access-gmzp5\") pod \"telemetry-operator-controller-manager-6749767b8f-rdc4j\" (UID: \"b8e1305a-5b6a-45f3-a228-b16259431de5\") " pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529079 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbfbm\" (UniqueName: \"kubernetes.io/projected/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-kube-api-access-qbfbm\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529117 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krnkl\" (UniqueName: \"kubernetes.io/projected/ccefb276-816b-481e-a645-2bc8b4619d7c-kube-api-access-krnkl\") pod \"swift-operator-controller-manager-68fc8c869-gltkz\" (UID: \"ccefb276-816b-481e-a645-2bc8b4619d7c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529151 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmpx\" (UniqueName: \"kubernetes.io/projected/eddacba3-772b-4f09-acfe-60f6c56ba39c-kube-api-access-8hmpx\") pod \"test-operator-controller-manager-56f8bfcd9f-zdp5q\" (UID: \"eddacba3-772b-4f09-acfe-60f6c56ba39c\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529180 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv78h\" (UniqueName: \"kubernetes.io/projected/d9414032-0155-4d4b-b456-02ee0f3f4185-kube-api-access-qv78h\") pod \"ovn-operator-controller-manager-788c46999f-t8sfg\" (UID: \"d9414032-0155-4d4b-b456-02ee0f3f4185\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529219 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmdkd\" (UniqueName: \"kubernetes.io/projected/f23972f8-fa47-444a-b26a-f02086d4f186-kube-api-access-kmdkd\") pod \"placement-operator-controller-manager-5b964cf4cd-wknxv\" (UID: \"f23972f8-fa47-444a-b26a-f02086d4f186\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529248 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.529272 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pl4l\" (UniqueName: \"kubernetes.io/projected/8a629cdc-714c-442c-90ae-d20b15d257c6-kube-api-access-7pl4l\") pod \"watcher-operator-controller-manager-564965969-th2ws\" (UID: \"8a629cdc-714c-442c-90ae-d20b15d257c6\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws"
Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.531691 4914 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.531776 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert podName:1d5fa522-5cf7-420f-be41-1d55eb8f1b2c nodeName:}" failed. No retries permitted until 2026-01-30 21:30:33.031754858 +0000 UTC m=+966.470391619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" (UID: "1d5fa522-5cf7-420f-be41-1d55eb8f1b2c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.534156 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-brvmw"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.534181 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.537215 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.541338 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.549611 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv78h\" (UniqueName: \"kubernetes.io/projected/d9414032-0155-4d4b-b456-02ee0f3f4185-kube-api-access-qv78h\") pod \"ovn-operator-controller-manager-788c46999f-t8sfg\" (UID: \"d9414032-0155-4d4b-b456-02ee0f3f4185\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.550953 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbfbm\" (UniqueName: \"kubernetes.io/projected/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-kube-api-access-qbfbm\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.552647 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krnkl\" (UniqueName: \"kubernetes.io/projected/ccefb276-816b-481e-a645-2bc8b4619d7c-kube-api-access-krnkl\") pod \"swift-operator-controller-manager-68fc8c869-gltkz\" (UID: \"ccefb276-816b-481e-a645-2bc8b4619d7c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.552681 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmdkd\" (UniqueName: \"kubernetes.io/projected/f23972f8-fa47-444a-b26a-f02086d4f186-kube-api-access-kmdkd\") pod \"placement-operator-controller-manager-5b964cf4cd-wknxv\" (UID: \"f23972f8-fa47-444a-b26a-f02086d4f186\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.554376 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.555953 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.558283 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fg7r5"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.563971 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.576011 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc"]
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.597496 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.630807 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hmpx\" (UniqueName: \"kubernetes.io/projected/eddacba3-772b-4f09-acfe-60f6c56ba39c-kube-api-access-8hmpx\") pod \"test-operator-controller-manager-56f8bfcd9f-zdp5q\" (UID: \"eddacba3-772b-4f09-acfe-60f6c56ba39c\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.630859 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.630891 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfc4k\" (UniqueName: \"kubernetes.io/projected/31234f15-4801-4840-95e7-e985b1d80aa5-kube-api-access-wfc4k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5l8kc\" (UID: \"31234f15-4801-4840-95e7-e985b1d80aa5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.630945 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.630994 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6k9\" (UniqueName: \"kubernetes.io/projected/f2909dee-0316-4626-b532-ebdd66466638-kube-api-access-qk6k9\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.631017 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pl4l\" (UniqueName: \"kubernetes.io/projected/8a629cdc-714c-442c-90ae-d20b15d257c6-kube-api-access-7pl4l\") pod \"watcher-operator-controller-manager-564965969-th2ws\" (UID: \"8a629cdc-714c-442c-90ae-d20b15d257c6\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.631109 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzp5\" (UniqueName: \"kubernetes.io/projected/b8e1305a-5b6a-45f3-a228-b16259431de5-kube-api-access-gmzp5\") pod \"telemetry-operator-controller-manager-6749767b8f-rdc4j\" (UID: \"b8e1305a-5b6a-45f3-a228-b16259431de5\") " pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.631157 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"
Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.631305 4914 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.631365 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert podName:f96847c8-b695-44a5-8756-b2fe0da5e409 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:33.631346663 +0000 UTC m=+967.069983424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert") pod "infra-operator-controller-manager-79955696d6-52vwl" (UID: "f96847c8-b695-44a5-8756-b2fe0da5e409") : secret "infra-operator-webhook-server-cert" not found
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.649179 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hmpx\" (UniqueName: \"kubernetes.io/projected/eddacba3-772b-4f09-acfe-60f6c56ba39c-kube-api-access-8hmpx\") pod \"test-operator-controller-manager-56f8bfcd9f-zdp5q\" (UID: \"eddacba3-772b-4f09-acfe-60f6c56ba39c\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.649983 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzp5\" (UniqueName: \"kubernetes.io/projected/b8e1305a-5b6a-45f3-a228-b16259431de5-kube-api-access-gmzp5\") pod \"telemetry-operator-controller-manager-6749767b8f-rdc4j\" (UID: \"b8e1305a-5b6a-45f3-a228-b16259431de5\") " pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.651172 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pl4l\" (UniqueName: \"kubernetes.io/projected/8a629cdc-714c-442c-90ae-d20b15d257c6-kube-api-access-7pl4l\") pod \"watcher-operator-controller-manager-564965969-th2ws\" (UID: \"8a629cdc-714c-442c-90ae-d20b15d257c6\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.673353 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.717740 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.731665 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.731724 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6k9\" (UniqueName: \"kubernetes.io/projected/f2909dee-0316-4626-b532-ebdd66466638-kube-api-access-qk6k9\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.731794 4914 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.731821 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfc4k\" (UniqueName: \"kubernetes.io/projected/31234f15-4801-4840-95e7-e985b1d80aa5-kube-api-access-wfc4k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5l8kc\" (UID: \"31234f15-4801-4840-95e7-e985b1d80aa5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.732406 4914 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.732485 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:33.232465485 +0000 UTC m=+966.671102246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "webhook-server-cert" not found Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.732729 4914 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:30:32 crc kubenswrapper[4914]: E0130 21:30:32.732758 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:33.232751292 +0000 UTC m=+966.671388053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "metrics-server-cert" not found Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.752380 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfc4k\" (UniqueName: \"kubernetes.io/projected/31234f15-4801-4840-95e7-e985b1d80aa5-kube-api-access-wfc4k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5l8kc\" (UID: \"31234f15-4801-4840-95e7-e985b1d80aa5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.755553 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6k9\" (UniqueName: \"kubernetes.io/projected/f2909dee-0316-4626-b532-ebdd66466638-kube-api-access-qk6k9\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " 
pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.765013 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.792947 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.812838 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws" Jan 30 21:30:32 crc kubenswrapper[4914]: I0130 21:30:32.881795 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" Jan 30 21:30:33 crc kubenswrapper[4914]: I0130 21:30:33.034418 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.034568 4914 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.034973 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert podName:1d5fa522-5cf7-420f-be41-1d55eb8f1b2c nodeName:}" failed. 
No retries permitted until 2026-01-30 21:30:34.034956188 +0000 UTC m=+967.473592949 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" (UID: "1d5fa522-5cf7-420f-be41-1d55eb8f1b2c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: I0130 21:30:33.238520 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:33 crc kubenswrapper[4914]: I0130 21:30:33.238669 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.239177 4914 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.239332 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:34.239305163 +0000 UTC m=+967.677941954 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "metrics-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.239339 4914 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.239456 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:34.239423665 +0000 UTC m=+967.678060466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "webhook-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: I0130 21:30:33.647259 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.647476 4914 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:30:33 crc kubenswrapper[4914]: E0130 21:30:33.647554 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert 
podName:f96847c8-b695-44a5-8756-b2fe0da5e409 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:35.64753425 +0000 UTC m=+969.086171011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert") pod "infra-operator-controller-manager-79955696d6-52vwl" (UID: "f96847c8-b695-44a5-8756-b2fe0da5e409") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: I0130 21:30:34.052570 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:34 crc kubenswrapper[4914]: E0130 21:30:34.052751 4914 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: E0130 21:30:34.052815 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert podName:1d5fa522-5cf7-420f-be41-1d55eb8f1b2c nodeName:}" failed. No retries permitted until 2026-01-30 21:30:36.052788135 +0000 UTC m=+969.491424896 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" (UID: "1d5fa522-5cf7-420f-be41-1d55eb8f1b2c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: I0130 21:30:34.254759 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:34 crc kubenswrapper[4914]: I0130 21:30:34.255162 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:34 crc kubenswrapper[4914]: E0130 21:30:34.254985 4914 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: E0130 21:30:34.255442 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:36.255422727 +0000 UTC m=+969.694059498 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "metrics-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: E0130 21:30:34.255359 4914 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: E0130 21:30:34.255927 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:36.255908629 +0000 UTC m=+969.694545400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "webhook-server-cert" not found Jan 30 21:30:34 crc kubenswrapper[4914]: I0130 21:30:34.883550 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"] Jan 30 21:30:34 crc kubenswrapper[4914]: I0130 21:30:34.933575 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b"] Jan 30 21:30:34 crc kubenswrapper[4914]: W0130 21:30:34.979657 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeddacba3_772b_4f09_acfe_60f6c56ba39c.slice/crio-b5978a4b7ca0a7464d48506f25838de9a560f177e8f327870274a0ad64feb5ed WatchSource:0}: Error finding container b5978a4b7ca0a7464d48506f25838de9a560f177e8f327870274a0ad64feb5ed: Status 404 returned error can't find the 
container with id b5978a4b7ca0a7464d48506f25838de9a560f177e8f327870274a0ad64feb5ed Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.331637 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.332754 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.693368 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.693593 4914 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.693640 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert podName:f96847c8-b695-44a5-8756-b2fe0da5e409 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:39.693626883 +0000 UTC m=+973.132263644 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert") pod "infra-operator-controller-manager-79955696d6-52vwl" (UID: "f96847c8-b695-44a5-8756-b2fe0da5e409") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.761607 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.769051 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.773331 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"] Jan 30 21:30:35 crc kubenswrapper[4914]: W0130 21:30:35.786964 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6d2cebc_7c79_407e_8f69_6b93ab2b41b7.slice/crio-0cb703bac2a2473eb547fa37631586c9f6661e2ba163a91d8ca880e34306de81 WatchSource:0}: Error finding container 0cb703bac2a2473eb547fa37631586c9f6661e2ba163a91d8ca880e34306de81: Status 404 returned error can't find the container with id 0cb703bac2a2473eb547fa37631586c9f6661e2ba163a91d8ca880e34306de81 Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.828828 4914 generic.go:334] "Generic (PLEG): container finished" podID="5cd8e563-19a1-460d-8a83-1b0d22d6212d" containerID="04a41297378697a9ab81185570e7141e4eec8bbb8ea9d4efba2f5ad70367fa12" exitCode=0 Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860059 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860097 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-4hvtw" event={"ID":"5cd8e563-19a1-460d-8a83-1b0d22d6212d","Type":"ContainerDied","Data":"04a41297378697a9ab81185570e7141e4eec8bbb8ea9d4efba2f5ad70367fa12"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860120 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q" event={"ID":"eddacba3-772b-4f09-acfe-60f6c56ba39c","Type":"ContainerStarted","Data":"b5978a4b7ca0a7464d48506f25838de9a560f177e8f327870274a0ad64feb5ed"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860130 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" event={"ID":"8c596723-7c41-448d-831e-07fa9d1129e9","Type":"ContainerStarted","Data":"670566f440f95786b7cb541286643c3adc577f774914007ad07a1e394172afe5"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860140 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v" event={"ID":"aa6ad36a-4244-490c-970b-52b03b3c3821","Type":"ContainerStarted","Data":"0607e4c6cbebdbfd1b8d0f3b703d39428bc7363cfb66571a3cbcc707d396fc51"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860151 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz" event={"ID":"ccefb276-816b-481e-a645-2bc8b4619d7c","Type":"ContainerStarted","Data":"44657207bd47bc08d4b8c4fc360bd134f91f677cbb97081b00e0f6a43017603e"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860160 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" event={"ID":"c6d2cebc-7c79-407e-8f69-6b93ab2b41b7","Type":"ContainerStarted","Data":"0cb703bac2a2473eb547fa37631586c9f6661e2ba163a91d8ca880e34306de81"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 
21:30:35.860170 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg" event={"ID":"d9414032-0155-4d4b-b456-02ee0f3f4185","Type":"ContainerStarted","Data":"8405cee5d1cef3d563e4ec19a8861196d665ea8441b49309d1172ca0a7794692"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.860181 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt" event={"ID":"91003d71-490f-458c-94e9-d8957d6eaac9","Type":"ContainerStarted","Data":"047d9a86653286480fdc3f74d431bcc8e72b377e2a2f5b82fc84ba8e6eabb1ce"} Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.886391 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.919069 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.933670 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.943575 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"] Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.950331 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6j98n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-wg7x6_openstack-operators(d505f587-6893-495d-99a0-acbdee4442df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.950497 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-98lkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-lxv78_openstack-operators(71273dcf-3d97-4a71-9b09-d8261da90f73): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.952831 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" podUID="d505f587-6893-495d-99a0-acbdee4442df" Jan 30 21:30:35 crc 
kubenswrapper[4914]: E0130 21:30:35.953040 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78" podUID="71273dcf-3d97-4a71-9b09-d8261da90f73" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.953386 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kmdkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-wknxv_openstack-operators(f23972f8-fa47-444a-b26a-f02086d4f186): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.953524 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5xnst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69d6db494d-qz2j7_openstack-operators(3330ea8a-466e-4ba5-ad5e-dcb7859521b0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.954703 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" podUID="3330ea8a-466e-4ba5-ad5e-dcb7859521b0" Jan 30 21:30:35 crc 
kubenswrapper[4914]: E0130 21:30:35.954779 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv" podUID="f23972f8-fa47-444a-b26a-f02086d4f186" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.957490 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m579p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5fb775575f-vvdgj_openstack-operators(1746db5a-3b9a-4d76-b1f3-845b907ccabc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.958809 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" podUID="1746db5a-3b9a-4d76-b1f3-845b907ccabc" Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.958855 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.965284 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-th2ws"] Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.968264 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:38.102.83.103:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gmzp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6749767b8f-rdc4j_openstack-operators(b8e1305a-5b6a-45f3-a228-b16259431de5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.969737 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" podUID="b8e1305a-5b6a-45f3-a228-b16259431de5" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.970673 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sw74l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d874c8fc-t225p_openstack-operators(9a7f7899-f35e-4fef-ba51-82af970498db): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:30:35 crc kubenswrapper[4914]: E0130 21:30:35.972621 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" podUID="9a7f7899-f35e-4fef-ba51-82af970498db" Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.976808 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.994124 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"] Jan 30 21:30:35 crc kubenswrapper[4914]: I0130 21:30:35.999698 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"] Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.005648 4914 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7"] Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.011682 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p"] Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.019944 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj"] Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.103560 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.103845 4914 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.103894 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert podName:1d5fa522-5cf7-420f-be41-1d55eb8f1b2c nodeName:}" failed. No retries permitted until 2026-01-30 21:30:40.103880649 +0000 UTC m=+973.542517410 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" (UID: "1d5fa522-5cf7-420f-be41-1d55eb8f1b2c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.306402 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.306480 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.306585 4914 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.306659 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:40.306644694 +0000 UTC m=+973.745281455 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "webhook-server-cert" not found Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.306911 4914 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.306994 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:40.306976462 +0000 UTC m=+973.745613223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "metrics-server-cert" not found Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.863050 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" event={"ID":"9a7f7899-f35e-4fef-ba51-82af970498db","Type":"ContainerStarted","Data":"3fe613feea8535d3364b0fde07238498d39dc7f14d2d2bc19e1b46427b72a4f3"} Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.866360 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" podUID="9a7f7899-f35e-4fef-ba51-82af970498db" Jan 30 21:30:36 crc 
kubenswrapper[4914]: I0130 21:30:36.868553 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hvtw" event={"ID":"5cd8e563-19a1-460d-8a83-1b0d22d6212d","Type":"ContainerStarted","Data":"9fb016e88c6f1e80f5e3b07cc27188f01ff6fda37a92114fc2b294e4f00ca23e"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.871164 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" event={"ID":"7fa005ac-0d9e-4784-8558-df96b2d54006","Type":"ContainerStarted","Data":"96ebb2f167cd1b2aa0660527c1437a45624a339501d417063f89b74b143733c2"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.872326 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv" event={"ID":"f23972f8-fa47-444a-b26a-f02086d4f186","Type":"ContainerStarted","Data":"5bc3578ff65f2dbf454c06867bdac579c552a2477fbc4730d7672b0d408c63f4"} Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.875876 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv" podUID="f23972f8-fa47-444a-b26a-f02086d4f186" Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.875996 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" event={"ID":"31234f15-4801-4840-95e7-e985b1d80aa5","Type":"ContainerStarted","Data":"f3663d8b3cc92cad96015486adb4d3aa028bf7e8c1bf7f03e48e3ab3f2b208b7"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.881324 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq" event={"ID":"54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27","Type":"ContainerStarted","Data":"6d939a96c031142494e77b4de174146fc9daf506951cb0a4517be31e16333d09"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.892421 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" event={"ID":"1746db5a-3b9a-4d76-b1f3-845b907ccabc","Type":"ContainerStarted","Data":"f030581ece2dd47f0e4b68a341c94a871a585c2dc797f8bd0931ed50153350ba"} Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.893994 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" podUID="1746db5a-3b9a-4d76-b1f3-845b907ccabc" Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.895119 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" event={"ID":"8419dc35-995b-43a3-82b7-6c2b7eb66d35","Type":"ContainerStarted","Data":"9bd6bb4b4e68f260e52947061eef186f86cf6fa961d06c421f8de0a5e6cf7970"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.900428 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hvtw" podStartSLOduration=2.377197704 podStartE2EDuration="10.900417981s" podCreationTimestamp="2026-01-30 21:30:26 +0000 UTC" firstStartedPulling="2026-01-30 21:30:27.71559042 +0000 UTC m=+961.154227181" lastFinishedPulling="2026-01-30 21:30:36.238810697 +0000 UTC m=+969.677447458" observedRunningTime="2026-01-30 21:30:36.895575815 +0000 UTC m=+970.334212576" watchObservedRunningTime="2026-01-30 
21:30:36.900417981 +0000 UTC m=+970.339054742" Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.907390 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" event={"ID":"b8e1305a-5b6a-45f3-a228-b16259431de5","Type":"ContainerStarted","Data":"6f7562a019486405736296de4a4fee6c3ae558636d4e19d43c997e528683222d"} Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.912832 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.103:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" podUID="b8e1305a-5b6a-45f3-a228-b16259431de5" Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.913440 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" event={"ID":"3330ea8a-466e-4ba5-ad5e-dcb7859521b0","Type":"ContainerStarted","Data":"8e86b9812b0a90d41c58ea44c2a2d1769c0efcb8e5900dfacd6a21e17b6765d1"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.917464 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78" event={"ID":"71273dcf-3d97-4a71-9b09-d8261da90f73","Type":"ContainerStarted","Data":"9c208fe5dc94eb11fe09b865c65642f13e24e92367f1cfe9e625231d8b8fd6c4"} Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.918126 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" 
podUID="3330ea8a-466e-4ba5-ad5e-dcb7859521b0" Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.918692 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78" podUID="71273dcf-3d97-4a71-9b09-d8261da90f73" Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.918904 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4" event={"ID":"0f009ea3-601c-4c2c-bbf4-d300abfe1100","Type":"ContainerStarted","Data":"197136c637f42104970d96e52f1a7b84613c4e830e4e181aae649f2686425d67"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.920559 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws" event={"ID":"8a629cdc-714c-442c-90ae-d20b15d257c6","Type":"ContainerStarted","Data":"7a3815961bc291041a580c7b4354d1a8f6270ce01b148114f38f0e1cf7940b55"} Jan 30 21:30:36 crc kubenswrapper[4914]: I0130 21:30:36.925959 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" event={"ID":"d505f587-6893-495d-99a0-acbdee4442df","Type":"ContainerStarted","Data":"ea1bead62d2394fb41965bfc701d2476bde81d292811acda1ffc019303366823"} Jan 30 21:30:36 crc kubenswrapper[4914]: E0130 21:30:36.934634 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" 
podUID="d505f587-6893-495d-99a0-acbdee4442df" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.949157 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" podUID="1746db5a-3b9a-4d76-b1f3-845b907ccabc" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.949243 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78" podUID="71273dcf-3d97-4a71-9b09-d8261da90f73" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.949403 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.103:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" podUID="b8e1305a-5b6a-45f3-a228-b16259431de5" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.949636 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" podUID="9a7f7899-f35e-4fef-ba51-82af970498db" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.950303 4914 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv" podUID="f23972f8-fa47-444a-b26a-f02086d4f186" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.951199 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" podUID="d505f587-6893-495d-99a0-acbdee4442df" Jan 30 21:30:37 crc kubenswrapper[4914]: E0130 21:30:37.951255 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" podUID="3330ea8a-466e-4ba5-ad5e-dcb7859521b0" Jan 30 21:30:39 crc kubenswrapper[4914]: I0130 21:30:39.770659 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:39 crc kubenswrapper[4914]: E0130 21:30:39.770927 4914 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:30:39 
crc kubenswrapper[4914]: E0130 21:30:39.771272 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert podName:f96847c8-b695-44a5-8756-b2fe0da5e409 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:47.771237598 +0000 UTC m=+981.209874439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert") pod "infra-operator-controller-manager-79955696d6-52vwl" (UID: "f96847c8-b695-44a5-8756-b2fe0da5e409") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:30:40 crc kubenswrapper[4914]: I0130 21:30:40.176763 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:40 crc kubenswrapper[4914]: E0130 21:30:40.177009 4914 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:40 crc kubenswrapper[4914]: E0130 21:30:40.177058 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert podName:1d5fa522-5cf7-420f-be41-1d55eb8f1b2c nodeName:}" failed. No retries permitted until 2026-01-30 21:30:48.177045046 +0000 UTC m=+981.615681807 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" (UID: "1d5fa522-5cf7-420f-be41-1d55eb8f1b2c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:30:40 crc kubenswrapper[4914]: I0130 21:30:40.379092 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:40 crc kubenswrapper[4914]: I0130 21:30:40.379198 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:40 crc kubenswrapper[4914]: E0130 21:30:40.379364 4914 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:30:40 crc kubenswrapper[4914]: E0130 21:30:40.379417 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:48.379404042 +0000 UTC m=+981.818040803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "webhook-server-cert" not found Jan 30 21:30:40 crc kubenswrapper[4914]: E0130 21:30:40.379653 4914 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:30:40 crc kubenswrapper[4914]: E0130 21:30:40.379798 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:30:48.379775071 +0000 UTC m=+981.818411892 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "metrics-server-cert" not found Jan 30 21:30:46 crc kubenswrapper[4914]: I0130 21:30:46.611110 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:46 crc kubenswrapper[4914]: I0130 21:30:46.611781 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:46 crc kubenswrapper[4914]: I0130 21:30:46.656217 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.047133 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hvtw" Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.115615 4914 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hvtw"] Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.159906 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljlfr"] Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.160182 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ljlfr" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" containerID="cri-o://6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" gracePeriod=2 Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.793675 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.799584 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f96847c8-b695-44a5-8756-b2fe0da5e409-cert\") pod \"infra-operator-controller-manager-79955696d6-52vwl\" (UID: \"f96847c8-b695-44a5-8756-b2fe0da5e409\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:47 crc kubenswrapper[4914]: I0130 21:30:47.874604 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.036460 4914 generic.go:334] "Generic (PLEG): container finished" podID="abd996f5-b265-4424-8c95-b670890abf5d" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" exitCode=0 Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.036541 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljlfr" event={"ID":"abd996f5-b265-4424-8c95-b670890abf5d","Type":"ContainerDied","Data":"6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982"} Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.205382 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.208883 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d5fa522-5cf7-420f-be41-1d55eb8f1b2c-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj\" (UID: \"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.213072 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.408620 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.408718 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:48 crc kubenswrapper[4914]: E0130 21:30:48.408888 4914 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:30:48 crc kubenswrapper[4914]: E0130 21:30:48.408936 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs podName:f2909dee-0316-4626-b532-ebdd66466638 nodeName:}" failed. No retries permitted until 2026-01-30 21:31:04.408919685 +0000 UTC m=+997.847556436 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-mch9s" (UID: "f2909dee-0316-4626-b532-ebdd66466638") : secret "webhook-server-cert" not found Jan 30 21:30:48 crc kubenswrapper[4914]: I0130 21:30:48.413174 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" Jan 30 21:30:48 crc kubenswrapper[4914]: E0130 21:30:48.428314 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c" Jan 30 21:30:48 crc kubenswrapper[4914]: E0130 21:30:48.428512 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ldzrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7b6c4d8c5f-9jf6b_openstack-operators(8c596723-7c41-448d-831e-07fa9d1129e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:30:48 crc kubenswrapper[4914]: E0130 21:30:48.429875 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" podUID="8c596723-7c41-448d-831e-07fa9d1129e9" Jan 30 21:30:49 crc kubenswrapper[4914]: E0130 21:30:49.042885 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" podUID="8c596723-7c41-448d-831e-07fa9d1129e9" Jan 30 21:30:49 crc kubenswrapper[4914]: E0130 21:30:49.188238 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 21:30:49 crc kubenswrapper[4914]: E0130 21:30:49.188674 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 21:30:49 crc kubenswrapper[4914]: E0130 21:30:49.189262 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 21:30:49 crc kubenswrapper[4914]: E0130 
21:30:49.189333 4914 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-ljlfr" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" Jan 30 21:30:56 crc kubenswrapper[4914]: I0130 21:30:56.983118 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:30:56 crc kubenswrapper[4914]: I0130 21:30:56.983960 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:30:57 crc kubenswrapper[4914]: E0130 21:30:57.841252 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382" Jan 30 21:30:57 crc kubenswrapper[4914]: E0130 21:30:57.841989 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6rz7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d9697b7f4-54hzc_openstack-operators(7fa005ac-0d9e-4784-8558-df96b2d54006): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:30:57 crc kubenswrapper[4914]: E0130 21:30:57.843306 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" podUID="7fa005ac-0d9e-4784-8558-df96b2d54006" Jan 30 21:30:58 crc kubenswrapper[4914]: E0130 21:30:58.116289 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" podUID="7fa005ac-0d9e-4784-8558-df96b2d54006" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.190935 4914 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.191737 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.192579 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982" cmd=["grpc_health_probe","-addr=:50051"] Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.192669 4914 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-ljlfr" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.426840 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.427007 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lcf5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-td9qq_openstack-operators(54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.428205 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq" podUID="54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.447732 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.447928 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wfc4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5l8kc_openstack-operators(31234f15-4801-4840-95e7-e985b1d80aa5): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:30:59 crc kubenswrapper[4914]: E0130 21:30:59.449103 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" podUID="31234f15-4801-4840-95e7-e985b1d80aa5" Jan 30 21:31:00 crc kubenswrapper[4914]: E0130 21:31:00.131881 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" podUID="31234f15-4801-4840-95e7-e985b1d80aa5" Jan 30 21:31:00 crc kubenswrapper[4914]: E0130 21:31:00.132134 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq" podUID="54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27" Jan 30 21:31:00 crc kubenswrapper[4914]: E0130 21:31:00.252834 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Jan 30 21:31:00 crc kubenswrapper[4914]: E0130 21:31:00.253061 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r2fdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-qc9x5_openstack-operators(8419dc35-995b-43a3-82b7-6c2b7eb66d35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 30 21:31:00 crc kubenswrapper[4914]: E0130 21:31:00.254250 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" podUID="8419dc35-995b-43a3-82b7-6c2b7eb66d35"
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.909044 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljlfr"
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.912852 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-utilities\") pod \"abd996f5-b265-4424-8c95-b670890abf5d\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") "
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.912906 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx6tq\" (UniqueName: \"kubernetes.io/projected/abd996f5-b265-4424-8c95-b670890abf5d-kube-api-access-vx6tq\") pod \"abd996f5-b265-4424-8c95-b670890abf5d\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") "
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.912933 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-catalog-content\") pod \"abd996f5-b265-4424-8c95-b670890abf5d\" (UID: \"abd996f5-b265-4424-8c95-b670890abf5d\") "
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.914125 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-utilities" (OuterVolumeSpecName: "utilities") pod "abd996f5-b265-4424-8c95-b670890abf5d" (UID: "abd996f5-b265-4424-8c95-b670890abf5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.925277 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd996f5-b265-4424-8c95-b670890abf5d-kube-api-access-vx6tq" (OuterVolumeSpecName: "kube-api-access-vx6tq") pod "abd996f5-b265-4424-8c95-b670890abf5d" (UID: "abd996f5-b265-4424-8c95-b670890abf5d"). InnerVolumeSpecName "kube-api-access-vx6tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:31:00 crc kubenswrapper[4914]: I0130 21:31:00.973346 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abd996f5-b265-4424-8c95-b670890abf5d" (UID: "abd996f5-b265-4424-8c95-b670890abf5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.013790 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.013843 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx6tq\" (UniqueName: \"kubernetes.io/projected/abd996f5-b265-4424-8c95-b670890abf5d-kube-api-access-vx6tq\") on node \"crc\" DevicePath \"\""
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.013854 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd996f5-b265-4424-8c95-b670890abf5d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.138423 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljlfr" event={"ID":"abd996f5-b265-4424-8c95-b670890abf5d","Type":"ContainerDied","Data":"d3ea73800d1edc80b30b92449f8ce2c196a8644837c2e9cc17edbd277d570952"}
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.138470 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljlfr"
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.138504 4914 scope.go:117] "RemoveContainer" containerID="6f521c250e901883dc75d57b02fc5c7aab8d9197dc7c1c7e49610bf3fe661982"
Jan 30 21:31:01 crc kubenswrapper[4914]: E0130 21:31:01.140010 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" podUID="8419dc35-995b-43a3-82b7-6c2b7eb66d35"
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.195751 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljlfr"]
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.203040 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ljlfr"]
Jan 30 21:31:01 crc kubenswrapper[4914]: I0130 21:31:01.826663 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abd996f5-b265-4424-8c95-b670890abf5d" path="/var/lib/kubelet/pods/abd996f5-b265-4424-8c95-b670890abf5d/volumes"
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.451838 4914 scope.go:117] "RemoveContainer" containerID="07716fdb83526288e3c9b7ec9bf39a021663cde022f38e2b6faa0a2277d28c76"
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.463604 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.469856 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f2909dee-0316-4626-b532-ebdd66466638-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-mch9s\" (UID: \"f2909dee-0316-4626-b532-ebdd66466638\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.570472 4914 scope.go:117] "RemoveContainer" containerID="6376db8c6b18e9cf7b3d8bbb043e255a6c7235b5370cf1b846979ea634cb9691"
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.648097 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.839728 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"]
Jan 30 21:31:04 crc kubenswrapper[4914]: I0130 21:31:04.953665 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"]
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.217247 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz" event={"ID":"ccefb276-816b-481e-a645-2bc8b4619d7c","Type":"ContainerStarted","Data":"07e6a5ce5c2db4f5aa572d72556f3b809d6f15f5578b91167642db737869e927"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.218470 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.232999 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" event={"ID":"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c","Type":"ContainerStarted","Data":"398591d86b89dbe2fe38491aadc52f876bc7282d183eaa62cce33f5c288fcc99"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.244253 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws" event={"ID":"8a629cdc-714c-442c-90ae-d20b15d257c6","Type":"ContainerStarted","Data":"51f9849fdca765243740873ab6bf981a626086c23253dc69e1979f73544d3dcf"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.245014 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.264424 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv" event={"ID":"f23972f8-fa47-444a-b26a-f02086d4f186","Type":"ContainerStarted","Data":"9a6ecfdf18516aedea3a11a34a5879c046028704afdf71414b850f76d80e838f"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.265132 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.277262 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"]
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.287138 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" event={"ID":"9a7f7899-f35e-4fef-ba51-82af970498db","Type":"ContainerStarted","Data":"05800d4ecfc95734ec00c5f738d5fb348d9b66f1856b0047d293884d97ea8955"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.287572 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.305777 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg" event={"ID":"d9414032-0155-4d4b-b456-02ee0f3f4185","Type":"ContainerStarted","Data":"568634f311dc895498c59e6771a5ca38fd8792e2455fdd0605b520f72b63af69"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.306422 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.343956 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt" event={"ID":"91003d71-490f-458c-94e9-d8957d6eaac9","Type":"ContainerStarted","Data":"2537df4ea20206f4b91e358cc197424e3e9c63a33566cee21e73bee2edca57ba"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.344607 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.373384 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz" podStartSLOduration=9.318064271 podStartE2EDuration="34.373367552s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.792188513 +0000 UTC m=+969.230825274" lastFinishedPulling="2026-01-30 21:31:00.847491794 +0000 UTC m=+994.286128555" observedRunningTime="2026-01-30 21:31:05.319146746 +0000 UTC m=+998.757783507" watchObservedRunningTime="2026-01-30 21:31:05.373367552 +0000 UTC m=+998.812004313"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.406102 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws" podStartSLOduration=8.462763857 podStartE2EDuration="33.406081974s" podCreationTimestamp="2026-01-30 21:30:32 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.903664334 +0000 UTC m=+969.342301095" lastFinishedPulling="2026-01-30 21:31:00.846982451 +0000 UTC m=+994.285619212" observedRunningTime="2026-01-30 21:31:05.370245237 +0000 UTC m=+998.808881988" watchObservedRunningTime="2026-01-30 21:31:05.406081974 +0000 UTC m=+998.844718725"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.419003 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" event={"ID":"b8e1305a-5b6a-45f3-a228-b16259431de5","Type":"ContainerStarted","Data":"00d2131cc66bffbb39b3dbe884f4eb15b39062bedd05aa62c40bb2e27a31f2e3"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.420115 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.420925 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p" podStartSLOduration=5.913313224 podStartE2EDuration="34.420909328s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.97055872 +0000 UTC m=+969.409195481" lastFinishedPulling="2026-01-30 21:31:04.478154814 +0000 UTC m=+997.916791585" observedRunningTime="2026-01-30 21:31:05.420412977 +0000 UTC m=+998.859049738" watchObservedRunningTime="2026-01-30 21:31:05.420909328 +0000 UTC m=+998.859546089"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.438915 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v" event={"ID":"aa6ad36a-4244-490c-970b-52b03b3c3821","Type":"ContainerStarted","Data":"62c813d43eb5a94a644ec5605fb7276423833c356fbf02ec269bb27d2b30ad69"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.439537 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.456048 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" event={"ID":"c6d2cebc-7c79-407e-8f69-6b93ab2b41b7","Type":"ContainerStarted","Data":"92b9f482dda1c9b71c028b030959c7992fdf464d12c929fd33b3b3ca30279536"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.456769 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.463256 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg" podStartSLOduration=9.40921766 podStartE2EDuration="34.4632401s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.791941387 +0000 UTC m=+969.230578148" lastFinishedPulling="2026-01-30 21:31:00.845963827 +0000 UTC m=+994.284600588" observedRunningTime="2026-01-30 21:31:05.461168461 +0000 UTC m=+998.899805222" watchObservedRunningTime="2026-01-30 21:31:05.4632401 +0000 UTC m=+998.901876861"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.468890 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78" event={"ID":"71273dcf-3d97-4a71-9b09-d8261da90f73","Type":"ContainerStarted","Data":"8764427b5028223af83321ee433d6d1eb4e121e3d3e17f5536bb95ed89de88c7"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.469511 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.479946 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" event={"ID":"1746db5a-3b9a-4d76-b1f3-845b907ccabc","Type":"ContainerStarted","Data":"86709a96c33761eba0269e7f57e5055e7544deed416191f7f178cd8d682c0f04"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.480684 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.495736 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4" event={"ID":"0f009ea3-601c-4c2c-bbf4-d300abfe1100","Type":"ContainerStarted","Data":"bbed8f279cfed630cded8d580eb79e7638e429f3e874f455d6a552ca95450c04"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.496359 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.509065 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" event={"ID":"f96847c8-b695-44a5-8756-b2fe0da5e409","Type":"ContainerStarted","Data":"45b5e7c58d2ab087a25a9c7d436232e8919713c3dbfd631aa49e7b5478527887"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.526004 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" event={"ID":"d505f587-6893-495d-99a0-acbdee4442df","Type":"ContainerStarted","Data":"db9d7ba8053567bb85dc2cda72b78691ea57a746631d44783d0d1ee9f66127bd"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.526673 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.541901 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" event={"ID":"3330ea8a-466e-4ba5-ad5e-dcb7859521b0","Type":"ContainerStarted","Data":"6d33c2828cfcb55f8aae244039a2cb1a9f5ddd0183a4d69ba030c57e0ab26a0f"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.542512 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.556867 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q" event={"ID":"eddacba3-772b-4f09-acfe-60f6c56ba39c","Type":"ContainerStarted","Data":"1ff769e09e4304f6cc3dec562abb3535e95e6c06ed80331cdf5d7d1b80b8a88e"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.557492 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.578933 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" event={"ID":"8c596723-7c41-448d-831e-07fa9d1129e9","Type":"ContainerStarted","Data":"3f494d6e379a2abb00d3d2062a9ba4101fffd149f0563c0baa8c239c0cda19e4"}
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.579237 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv" podStartSLOduration=6.039044164 podStartE2EDuration="34.579221042s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.953241081 +0000 UTC m=+969.391877842" lastFinishedPulling="2026-01-30 21:31:04.493417949 +0000 UTC m=+997.932054720" observedRunningTime="2026-01-30 21:31:05.527013885 +0000 UTC m=+998.965650646" watchObservedRunningTime="2026-01-30 21:31:05.579221042 +0000 UTC m=+999.017857803"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.579561 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.582379 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj" podStartSLOduration=6.061896602 podStartE2EDuration="34.582370698s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.957380091 +0000 UTC m=+969.396016852" lastFinishedPulling="2026-01-30 21:31:04.477854187 +0000 UTC m=+997.916490948" observedRunningTime="2026-01-30 21:31:05.579005077 +0000 UTC m=+999.017641838" watchObservedRunningTime="2026-01-30 21:31:05.582370698 +0000 UTC m=+999.021007459"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.796962 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4" podStartSLOduration=9.85030939 podStartE2EDuration="34.796946517s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.900417126 +0000 UTC m=+969.339053887" lastFinishedPulling="2026-01-30 21:31:00.847054263 +0000 UTC m=+994.285691014" observedRunningTime="2026-01-30 21:31:05.718960083 +0000 UTC m=+999.157596844" watchObservedRunningTime="2026-01-30 21:31:05.796946517 +0000 UTC m=+999.235583278"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.877073 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt" podStartSLOduration=9.379948083 podStartE2EDuration="34.877054601s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.349536815 +0000 UTC m=+968.788173576" lastFinishedPulling="2026-01-30 21:31:00.846643333 +0000 UTC m=+994.285280094" observedRunningTime="2026-01-30 21:31:05.801938356 +0000 UTC m=+999.240575117" watchObservedRunningTime="2026-01-30 21:31:05.877054601 +0000 UTC m=+999.315691362"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.878776 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j" podStartSLOduration=5.334640504 podStartE2EDuration="33.878771902s" podCreationTimestamp="2026-01-30 21:30:32 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.968120361 +0000 UTC m=+969.406757122" lastFinishedPulling="2026-01-30 21:31:04.512251749 +0000 UTC m=+997.950888520" observedRunningTime="2026-01-30 21:31:05.876910968 +0000 UTC m=+999.315547729" watchObservedRunningTime="2026-01-30 21:31:05.878771902 +0000 UTC m=+999.317408663"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.922437 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78" podStartSLOduration=9.868848036 podStartE2EDuration="34.922420126s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.792376447 +0000 UTC m=+969.231013208" lastFinishedPulling="2026-01-30 21:31:00.845948537 +0000 UTC m=+994.284585298" observedRunningTime="2026-01-30 21:31:05.920885729 +0000 UTC m=+999.359522490" watchObservedRunningTime="2026-01-30 21:31:05.922420126 +0000 UTC m=+999.361056887"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.960648 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6" podStartSLOduration=6.424518948 podStartE2EDuration="34.960627409s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.950117516 +0000 UTC m=+969.388754277" lastFinishedPulling="2026-01-30 21:31:04.486225977 +0000 UTC m=+997.924862738" observedRunningTime="2026-01-30 21:31:05.953903348 +0000 UTC m=+999.392540119" watchObservedRunningTime="2026-01-30 21:31:05.960627409 +0000 UTC m=+999.399264170"
Jan 30 21:31:05 crc kubenswrapper[4914]: I0130 21:31:05.984838 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78" podStartSLOduration=6.458157451 podStartE2EDuration="34.984820367s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.950385982 +0000 UTC m=+969.389022743" lastFinishedPulling="2026-01-30 21:31:04.477048898 +0000 UTC m=+997.915685659" observedRunningTime="2026-01-30 21:31:05.982279766 +0000 UTC m=+999.420916537" watchObservedRunningTime="2026-01-30 21:31:05.984820367 +0000 UTC m=+999.423457128"
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.013300 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v" podStartSLOduration=10.11511343 podStartE2EDuration="35.013282947s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.333855706 +0000 UTC m=+968.772492477" lastFinishedPulling="2026-01-30 21:31:00.232025233 +0000 UTC m=+993.670661994" observedRunningTime="2026-01-30 21:31:06.009063597 +0000 UTC m=+999.447700358" watchObservedRunningTime="2026-01-30 21:31:06.013282947 +0000 UTC m=+999.451919708"
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.079886 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b" podStartSLOduration=5.608888412 podStartE2EDuration="35.079868059s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.005004955 +0000 UTC m=+968.443641716" lastFinishedPulling="2026-01-30 21:31:04.475984602 +0000 UTC m=+997.914621363" observedRunningTime="2026-01-30 21:31:06.079188193 +0000 UTC m=+999.517824954" watchObservedRunningTime="2026-01-30 21:31:06.079868059 +0000 UTC m=+999.518504820"
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.079982 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q" podStartSLOduration=8.831190059 podStartE2EDuration="34.079978032s" podCreationTimestamp="2026-01-30 21:30:32 +0000 UTC" firstStartedPulling="2026-01-30 21:30:34.98323863 +0000 UTC m=+968.421875381" lastFinishedPulling="2026-01-30 21:31:00.232026593 +0000 UTC m=+993.670663354" observedRunningTime="2026-01-30 21:31:06.052815882 +0000 UTC m=+999.491452643" watchObservedRunningTime="2026-01-30 21:31:06.079978032 +0000 UTC m=+999.518614793"
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.100627 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7" podStartSLOduration=6.560654263 podStartE2EDuration="35.100611435s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.953459177 +0000 UTC m=+969.392095938" lastFinishedPulling="2026-01-30 21:31:04.493416349 +0000 UTC m=+997.932053110" observedRunningTime="2026-01-30 21:31:06.096278751 +0000 UTC m=+999.534915512" watchObservedRunningTime="2026-01-30 21:31:06.100611435 +0000 UTC m=+999.539248196"
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.588490 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" event={"ID":"f2909dee-0316-4626-b532-ebdd66466638","Type":"ContainerStarted","Data":"ecf40c2570ef4230c86fea6a41734d8177d6d71f56854af600a249ee0c0fd5ba"}
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.588528 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" event={"ID":"f2909dee-0316-4626-b532-ebdd66466638","Type":"ContainerStarted","Data":"0111ddd5ba8e89e109678aab138503619595f2c95e32abc2bff53c65353a13ee"}
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.591209 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:31:06 crc kubenswrapper[4914]: I0130 21:31:06.627724 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s" podStartSLOduration=34.627685903 podStartE2EDuration="34.627685903s" podCreationTimestamp="2026-01-30 21:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:31:06.620117602 +0000 UTC m=+1000.058754363" watchObservedRunningTime="2026-01-30 21:31:06.627685903 +0000 UTC m=+1000.066322664"
Jan 30 21:31:08 crc kubenswrapper[4914]: I0130 21:31:08.611692 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" event={"ID":"1d5fa522-5cf7-420f-be41-1d55eb8f1b2c","Type":"ContainerStarted","Data":"6f39180a3396cefa88b6e15e95ff90847afd5320f9d65e6d54a39f08433f67a2"}
Jan 30 21:31:08 crc kubenswrapper[4914]: I0130 21:31:08.613356 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" event={"ID":"f96847c8-b695-44a5-8756-b2fe0da5e409","Type":"ContainerStarted","Data":"94921bf752880f53e97cf10902f5534415ac3ca3b96046c1b74c185b9d1cf6cd"}
Jan 30 21:31:08 crc kubenswrapper[4914]: I0130 21:31:08.613525 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl"
Jan 30 21:31:08 crc kubenswrapper[4914]: I0130 21:31:08.654949 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" podStartSLOduration=34.379486878 podStartE2EDuration="37.654927278s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:31:04.987055088 +0000 UTC m=+998.425691849" lastFinishedPulling="2026-01-30 21:31:08.262495458 +0000 UTC m=+1001.701132249" observedRunningTime="2026-01-30 21:31:08.646618089 +0000 UTC m=+1002.085254850" watchObservedRunningTime="2026-01-30 21:31:08.654927278 +0000 UTC m=+1002.093564049"
Jan 30 21:31:08 crc kubenswrapper[4914]: I0130 21:31:08.672087 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" podStartSLOduration=34.30758621 podStartE2EDuration="37.672067588s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:31:04.901154045 +0000 UTC m=+998.339790806" lastFinishedPulling="2026-01-30 21:31:08.265635423 +0000 UTC m=+1001.704272184" observedRunningTime="2026-01-30 21:31:08.67132346 +0000 UTC m=+1002.109960231" watchObservedRunningTime="2026-01-30 21:31:08.672067588 +0000 UTC m=+1002.110704349"
Jan 30 21:31:09 crc kubenswrapper[4914]: I0130 21:31:09.625780 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" event={"ID":"7fa005ac-0d9e-4784-8558-df96b2d54006","Type":"ContainerStarted","Data":"be38e46c5c5571f811bed576de2b4d758e4d0d1929c5e6ca812dbb7dd995f34d"}
Jan 30 21:31:09 crc kubenswrapper[4914]: I0130 21:31:09.626170 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj"
Jan 30 21:31:09 crc kubenswrapper[4914]: I0130 21:31:09.656036 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" podStartSLOduration=5.226116361 podStartE2EDuration="38.656012696s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.942773299 +0000 UTC m=+969.381410060" lastFinishedPulling="2026-01-30 21:31:09.372669624 +0000 UTC m=+1002.811306395" observedRunningTime="2026-01-30 21:31:09.645785032 +0000 UTC m=+1003.084421833" watchObservedRunningTime="2026-01-30 21:31:09.656012696 +0000 UTC m=+1003.094649487"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.024161 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-9jf6b"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.072942 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.081401 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-t225p"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.146205 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-hrq78"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.193132 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-qz2j7"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.234440 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vvdgj"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.365218 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bmgvt"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.383304 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-648g4"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.389772 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-kjw6v"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.405533 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-wg7x6"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.568638 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-lxv78"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.602606 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-t8sfg"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.676181 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-wknxv"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.727419 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gltkz"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.766898 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-rdc4j"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.800370 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zdp5q"
Jan 30 21:31:12 crc kubenswrapper[4914]: I0130 21:31:12.820321 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-th2ws"
Jan 30 21:31:13 crc kubenswrapper[4914]: I0130 21:31:13.657532 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" event={"ID":"31234f15-4801-4840-95e7-e985b1d80aa5","Type":"ContainerStarted","Data":"202a3130b57b60b58f1e4f93712494d7e42843416aeb158e00717bb44e1a7547"}
Jan 30 21:31:13 crc kubenswrapper[4914]: I0130 21:31:13.681407 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5l8kc" podStartSLOduration=4.15150418 podStartE2EDuration="41.681383541s" podCreationTimestamp="2026-01-30 21:30:32 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.913460791 +0000 UTC m=+969.352097552" lastFinishedPulling="2026-01-30 21:31:13.443340142 +0000 UTC m=+1006.881976913" observedRunningTime="2026-01-30 21:31:13.674520957 +0000 UTC m=+1007.113157728" watchObservedRunningTime="2026-01-30 21:31:13.681383541 +0000 UTC m=+1007.120020302"
Jan 30 21:31:14 crc kubenswrapper[4914]: I0130 21:31:14.653948 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-mch9s"
Jan 30 21:31:14 crc kubenswrapper[4914]: I0130 21:31:14.665820 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq" event={"ID":"54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27","Type":"ContainerStarted","Data":"2ba88305a6231c0e296b65116372ee1e09dff3db517922678088c3856c2e29b0"}
Jan 30 21:31:14 crc kubenswrapper[4914]: I0130 21:31:14.666232 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq"
Jan 30 21:31:14 crc kubenswrapper[4914]: I0130 21:31:14.725643 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq" podStartSLOduration=6.00052739 podStartE2EDuration="43.72562238s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.942790099 +0000 UTC m=+969.381426860" lastFinishedPulling="2026-01-30 21:31:13.667885089 +0000 UTC m=+1007.106521850" observedRunningTime="2026-01-30 21:31:14.717022545 +0000 UTC m=+1008.155659316" watchObservedRunningTime="2026-01-30 21:31:14.72562238 +0000 UTC m=+1008.164259151"
Jan 30 21:31:16 crc kubenswrapper[4914]: I0130 21:31:16.690964 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" event={"ID":"8419dc35-995b-43a3-82b7-6c2b7eb66d35","Type":"ContainerStarted","Data":"c6fbf6bbc44749ec9c927fe4ebe796524e2d09e021efa66834adad842679df4d"}
Jan 30 21:31:16 crc kubenswrapper[4914]: I0130 21:31:16.691445 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5"
Jan 30 21:31:16 crc kubenswrapper[4914]: I0130 21:31:16.707223 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" podStartSLOduration=5.256092543
podStartE2EDuration="45.707202144s" podCreationTimestamp="2026-01-30 21:30:31 +0000 UTC" firstStartedPulling="2026-01-30 21:30:35.886027228 +0000 UTC m=+969.324663999" lastFinishedPulling="2026-01-30 21:31:16.337136849 +0000 UTC m=+1009.775773600" observedRunningTime="2026-01-30 21:31:16.706332424 +0000 UTC m=+1010.144969195" watchObservedRunningTime="2026-01-30 21:31:16.707202144 +0000 UTC m=+1010.145838915" Jan 30 21:31:17 crc kubenswrapper[4914]: I0130 21:31:17.886788 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-52vwl" Jan 30 21:31:18 crc kubenswrapper[4914]: I0130 21:31:18.222859 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj" Jan 30 21:31:22 crc kubenswrapper[4914]: I0130 21:31:22.076302 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-54hzc" Jan 30 21:31:22 crc kubenswrapper[4914]: I0130 21:31:22.298847 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-qc9x5" Jan 30 21:31:22 crc kubenswrapper[4914]: I0130 21:31:22.493586 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-td9qq" Jan 30 21:31:26 crc kubenswrapper[4914]: I0130 21:31:26.983531 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:31:26 crc kubenswrapper[4914]: I0130 21:31:26.983948 4914 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.295144 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-srgc7"] Jan 30 21:31:43 crc kubenswrapper[4914]: E0130 21:31:43.295786 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="extract-utilities" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.295798 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="extract-utilities" Jan 30 21:31:43 crc kubenswrapper[4914]: E0130 21:31:43.295812 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="extract-content" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.295818 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="extract-content" Jan 30 21:31:43 crc kubenswrapper[4914]: E0130 21:31:43.295830 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.295837 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.295976 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd996f5-b265-4424-8c95-b670890abf5d" containerName="registry-server" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.296631 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.303191 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fpdjq" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.303574 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.303732 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.305295 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.318067 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-srgc7"] Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.351528 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-m99ms"] Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.353143 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.359582 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.366941 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-m99ms"] Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.412883 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-config\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.412956 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsnzc\" (UniqueName: \"kubernetes.io/projected/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-kube-api-access-xsnzc\") pod \"dnsmasq-dns-675f4bcbfc-srgc7\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.413023 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.413045 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5k9p\" (UniqueName: \"kubernetes.io/projected/f117df6e-671b-4401-8056-0d094bf65b8b-kube-api-access-l5k9p\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.413073 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-config\") pod \"dnsmasq-dns-675f4bcbfc-srgc7\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.514602 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.514639 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5k9p\" (UniqueName: \"kubernetes.io/projected/f117df6e-671b-4401-8056-0d094bf65b8b-kube-api-access-l5k9p\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.514663 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-config\") pod \"dnsmasq-dns-675f4bcbfc-srgc7\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.514732 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-config\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: 
I0130 21:31:43.514768 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsnzc\" (UniqueName: \"kubernetes.io/projected/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-kube-api-access-xsnzc\") pod \"dnsmasq-dns-675f4bcbfc-srgc7\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.515754 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-config\") pod \"dnsmasq-dns-675f4bcbfc-srgc7\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.515858 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-config\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.515867 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.533061 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsnzc\" (UniqueName: \"kubernetes.io/projected/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-kube-api-access-xsnzc\") pod \"dnsmasq-dns-675f4bcbfc-srgc7\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.535747 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l5k9p\" (UniqueName: \"kubernetes.io/projected/f117df6e-671b-4401-8056-0d094bf65b8b-kube-api-access-l5k9p\") pod \"dnsmasq-dns-78dd6ddcc-m99ms\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.617041 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:31:43 crc kubenswrapper[4914]: I0130 21:31:43.675064 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:31:44 crc kubenswrapper[4914]: I0130 21:31:44.077579 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-srgc7"] Jan 30 21:31:44 crc kubenswrapper[4914]: I0130 21:31:44.172797 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-m99ms"] Jan 30 21:31:44 crc kubenswrapper[4914]: W0130 21:31:44.176438 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf117df6e_671b_4401_8056_0d094bf65b8b.slice/crio-92fb36db52ab126ee5601ecc4b10af018e6112617c5a51c7aa4999947d2ce3ae WatchSource:0}: Error finding container 92fb36db52ab126ee5601ecc4b10af018e6112617c5a51c7aa4999947d2ce3ae: Status 404 returned error can't find the container with id 92fb36db52ab126ee5601ecc4b10af018e6112617c5a51c7aa4999947d2ce3ae Jan 30 21:31:44 crc kubenswrapper[4914]: I0130 21:31:44.953944 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" event={"ID":"f117df6e-671b-4401-8056-0d094bf65b8b","Type":"ContainerStarted","Data":"92fb36db52ab126ee5601ecc4b10af018e6112617c5a51c7aa4999947d2ce3ae"} Jan 30 21:31:44 crc kubenswrapper[4914]: I0130 21:31:44.954925 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" 
event={"ID":"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d","Type":"ContainerStarted","Data":"6892df0d5b5d20e59555e60013bd63322529784f4826473504594e7fc1a0ce8d"} Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.205648 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-srgc7"] Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.226645 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7jjwv"] Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.227742 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.239310 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7jjwv"] Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.254414 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.254461 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-config\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.254485 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqs6b\" (UniqueName: \"kubernetes.io/projected/fa9f2445-e517-4f92-a54e-6008fc190663-kube-api-access-cqs6b\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " 
pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.356091 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.356137 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-config\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.356176 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqs6b\" (UniqueName: \"kubernetes.io/projected/fa9f2445-e517-4f92-a54e-6008fc190663-kube-api-access-cqs6b\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.357028 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.357371 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-config\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.382483 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqs6b\" (UniqueName: \"kubernetes.io/projected/fa9f2445-e517-4f92-a54e-6008fc190663-kube-api-access-cqs6b\") pod \"dnsmasq-dns-666b6646f7-7jjwv\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.476420 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-m99ms"] Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.500312 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rnjqw"] Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.501456 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.523328 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rnjqw"] Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.551952 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.559575 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.559633 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-config\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.559671 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-658x5\" (UniqueName: \"kubernetes.io/projected/a6edeafb-6617-4058-9b35-bf0bb078ceba-kube-api-access-658x5\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.661407 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.661493 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-config\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.661559 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-658x5\" (UniqueName: \"kubernetes.io/projected/a6edeafb-6617-4058-9b35-bf0bb078ceba-kube-api-access-658x5\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.662309 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.662396 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-config\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.683173 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-658x5\" (UniqueName: \"kubernetes.io/projected/a6edeafb-6617-4058-9b35-bf0bb078ceba-kube-api-access-658x5\") pod \"dnsmasq-dns-57d769cc4f-rnjqw\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:46 crc kubenswrapper[4914]: I0130 21:31:46.820172 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.184880 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7jjwv"] Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.387534 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.389733 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.391795 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ldxml" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.392171 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.392247 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.392328 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.392412 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.392464 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.392530 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.410579 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479166 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479230 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479277 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-config-data\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479306 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479336 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0" Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479367 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479390 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479413 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479432 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479453 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.479481 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2f7h\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-kube-api-access-f2f7h\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.580988 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-config-data\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581830 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581863 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581886 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581907 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581931 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581950 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581968 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.581998 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2f7h\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-kube-api-access-f2f7h\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.582020 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.582047 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.584928 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.585316 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.585492 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.586631 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-config-data\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.588570 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.589978 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.590141 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.590569 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.590603 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5be60c8bd10711c8c5de45ccf3a82e8fb293e62f568476a547ebf3cff93a9a23/globalmount\"" pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.596381 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.597369 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.602398 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2f7h\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-kube-api-access-f2f7h\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.634110 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") " pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.649316 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.650881 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.654879 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.654932 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.655157 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.656109 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.656316 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.656644 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-c4brf"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.657737 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.660952 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.709205 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.784729 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.784981 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.784999 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f394410a-5ff7-4a0c-84ec-4b60c63c707c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785021 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785037 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqh2\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-kube-api-access-6hqh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785090 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785111 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785129 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f394410a-5ff7-4a0c-84ec-4b60c63c707c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785173 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785194 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.785211 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886756 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f394410a-5ff7-4a0c-84ec-4b60c63c707c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886821 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886847 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886864 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886881 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886906 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886921 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f394410a-5ff7-4a0c-84ec-4b60c63c707c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886958 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.886978 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqh2\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-kube-api-access-6hqh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.887032 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.887053 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.888125 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.888594 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.889047 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.889862 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.890591 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.891199 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f394410a-5ff7-4a0c-84ec-4b60c63c707c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.892287 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.892318 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d876d4ec8ae95b00698cc5f4700898c35211c994a02910946b2404f2e7b6a3a2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.901451 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.907346 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f394410a-5ff7-4a0c-84ec-4b60c63c707c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.907348 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.915688 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqh2\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-kube-api-access-6hqh2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.941801 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:47 crc kubenswrapper[4914]: I0130 21:31:47.994249 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.843881 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.845422 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.848137 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.848567 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.852555 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-h4v89"
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.852853 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.853102 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Jan 30 21:31:48 crc kubenswrapper[4914]: I0130 21:31:48.858082 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004303 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63625a35-5028-4dda-b9b3-ec3910fd8385-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004364 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-kolla-config\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004385 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-operator-scripts\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004434 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-config-data-default\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004479 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004517 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9m75\" (UniqueName: \"kubernetes.io/projected/63625a35-5028-4dda-b9b3-ec3910fd8385-kube-api-access-d9m75\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004559 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63625a35-5028-4dda-b9b3-ec3910fd8385-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.004579 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63625a35-5028-4dda-b9b3-ec3910fd8385-config-data-generated\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106251 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63625a35-5028-4dda-b9b3-ec3910fd8385-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106299 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-kolla-config\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106313 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-operator-scripts\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106345 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-config-data-default\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106371 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106391 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9m75\" (UniqueName: \"kubernetes.io/projected/63625a35-5028-4dda-b9b3-ec3910fd8385-kube-api-access-d9m75\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106425 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63625a35-5028-4dda-b9b3-ec3910fd8385-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106442 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63625a35-5028-4dda-b9b3-ec3910fd8385-config-data-generated\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.106863 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63625a35-5028-4dda-b9b3-ec3910fd8385-config-data-generated\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.107399 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-kolla-config\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.107532 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-config-data-default\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.108494 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63625a35-5028-4dda-b9b3-ec3910fd8385-operator-scripts\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.108994 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.109278 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/46ecffea2e9537b804e457d8db7c8ba0d287f032f9e7d57e9977f99d6dda7166/globalmount\"" pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.120776 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63625a35-5028-4dda-b9b3-ec3910fd8385-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.123068 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63625a35-5028-4dda-b9b3-ec3910fd8385-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.130522 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9m75\" (UniqueName: \"kubernetes.io/projected/63625a35-5028-4dda-b9b3-ec3910fd8385-kube-api-access-d9m75\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.167728 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c6555ba-57ab-43a0-ba83-7b69cd40a114\") pod \"openstack-galera-0\" (UID: \"63625a35-5028-4dda-b9b3-ec3910fd8385\") " pod="openstack/openstack-galera-0"
Jan 30 21:31:49 crc kubenswrapper[4914]: I0130 21:31:49.174144 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.180541 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.182052 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.184185 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9cm59"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.184799 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.184828 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.184996 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.198024 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326469 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326538 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3bc7da-e810-4d0a-a7df-792c544f3a23-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326574 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3bc7da-e810-4d0a-a7df-792c544f3a23-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326626 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vc28\" (UniqueName: \"kubernetes.io/projected/da3bc7da-e810-4d0a-a7df-792c544f3a23-kube-api-access-9vc28\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326655 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326747 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da3bc7da-e810-4d0a-a7df-792c544f3a23-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326770 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.326816 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.371992 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.372920 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.376744 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-wdkms"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.376949 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.377127 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.382167 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.428009 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429124 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3bc7da-e810-4d0a-a7df-792c544f3a23-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429185 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3bc7da-e810-4d0a-a7df-792c544f3a23-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429238 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vc28\" (UniqueName: \"kubernetes.io/projected/da3bc7da-e810-4d0a-a7df-792c544f3a23-kube-api-access-9vc28\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429272 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429343 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da3bc7da-e810-4d0a-a7df-792c544f3a23-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429369 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429392 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.429676 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.430203 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.430363 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/da3bc7da-e810-4d0a-a7df-792c544f3a23-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.431576 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da3bc7da-e810-4d0a-a7df-792c544f3a23-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.432895 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3bc7da-e810-4d0a-a7df-792c544f3a23-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.435293 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3bc7da-e810-4d0a-a7df-792c544f3a23-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.438919 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.438966 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a5ff983e02a6cd630a48140990def2f8a6ef4b2917a1c3bc89d3201efeec2b78/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.447445 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vc28\" (UniqueName: \"kubernetes.io/projected/da3bc7da-e810-4d0a-a7df-792c544f3a23-kube-api-access-9vc28\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.478887 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8536c2-93d4-40e7-9fcb-dc25fee589db\") pod \"openstack-cell1-galera-0\" (UID: \"da3bc7da-e810-4d0a-a7df-792c544f3a23\") " pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.505541 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.530965 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-config-data\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.531299 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xwv\" (UniqueName: \"kubernetes.io/projected/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-kube-api-access-q7xwv\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.531358 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-kolla-config\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.531399 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.531422 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.632862 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.632909 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.632950 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-config-data\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.632973 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xwv\" (UniqueName: \"kubernetes.io/projected/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-kube-api-access-q7xwv\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.633023 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-kolla-config\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.633719 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-kolla-config\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.634840 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-config-data\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.636852 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.639010 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.654821 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7xwv\" (UniqueName: \"kubernetes.io/projected/1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0-kube-api-access-q7xwv\") pod \"memcached-0\" (UID: \"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0\") " pod="openstack/memcached-0"
Jan 30 21:31:50 crc kubenswrapper[4914]: I0130 21:31:50.696837 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.040770 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" event={"ID":"fa9f2445-e517-4f92-a54e-6008fc190663","Type":"ContainerStarted","Data":"ad6f410a45bb66f49bd1985d2c7c9f463d0e0baa658e592afd1ca03af987d485"}
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.289417 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.290310 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.297660 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hc9kb"
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.310316 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.357914 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knszm\" (UniqueName: \"kubernetes.io/projected/134b35c4-3656-4890-8cb2-76bc09779403-kube-api-access-knszm\") pod \"kube-state-metrics-0\" (UID: \"134b35c4-3656-4890-8cb2-76bc09779403\") " pod="openstack/kube-state-metrics-0"
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.461590 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knszm\" (UniqueName: \"kubernetes.io/projected/134b35c4-3656-4890-8cb2-76bc09779403-kube-api-access-knszm\") pod \"kube-state-metrics-0\" (UID: \"134b35c4-3656-4890-8cb2-76bc09779403\") " pod="openstack/kube-state-metrics-0"
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.485063 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knszm\" (UniqueName: \"kubernetes.io/projected/134b35c4-3656-4890-8cb2-76bc09779403-kube-api-access-knszm\") pod \"kube-state-metrics-0\" (UID: \"134b35c4-3656-4890-8cb2-76bc09779403\") " pod="openstack/kube-state-metrics-0"
Jan 30 21:31:52 crc kubenswrapper[4914]: I0130 21:31:52.608402 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.028403 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.030218 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.035675 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.035772 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.036001 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.037213 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.048806 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-r5zz5"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.059603 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175517 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/46107121-a72c-40a7-904c-24c6c33de7c4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175553 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fktlx\" (UniqueName: \"kubernetes.io/projected/46107121-a72c-40a7-904c-24c6c33de7c4-kube-api-access-fktlx\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175575 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46107121-a72c-40a7-904c-24c6c33de7c4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175628 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175669 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175756 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.175779 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46107121-a72c-40a7-904c-24c6c33de7c4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.276787 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.276865 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.276895 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46107121-a72c-40a7-904c-24c6c33de7c4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.276947 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/46107121-a72c-40a7-904c-24c6c33de7c4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.276965 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fktlx\" (UniqueName: \"kubernetes.io/projected/46107121-a72c-40a7-904c-24c6c33de7c4-kube-api-access-fktlx\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.276982 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46107121-a72c-40a7-904c-24c6c33de7c4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.277007 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.277511 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/46107121-a72c-40a7-904c-24c6c33de7c4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.280096 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.280855 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.286988 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46107121-a72c-40a7-904c-24c6c33de7c4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.287263 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46107121-a72c-40a7-904c-24c6c33de7c4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.289295 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/46107121-a72c-40a7-904c-24c6c33de7c4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.303152 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fktlx\" (UniqueName: \"kubernetes.io/projected/46107121-a72c-40a7-904c-24c6c33de7c4-kube-api-access-fktlx\") pod \"alertmanager-metric-storage-0\" (UID: \"46107121-a72c-40a7-904c-24c6c33de7c4\") " pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.349989 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.625313 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.627027 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.629523 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.629952 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8cvf7"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.630230 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.630397 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.631137 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.631156 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.631901 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.631943 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.685694 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785186 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785238 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785337 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0"
Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785382 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-1\") pod
\"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785470 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785487 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785594 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785668 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785699 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.785856 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2n5\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-kube-api-access-mn2n5\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887267 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887319 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887347 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887366 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887394 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887410 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887441 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887473 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 
21:31:53.887492 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.887524 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2n5\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-kube-api-access-mn2n5\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.892560 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.895335 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.895919 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: 
I0130 21:31:53.896355 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.898431 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.898919 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.903298 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.903750 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 
21:31:53.904628 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2n5\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-kube-api-access-mn2n5\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.911009 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.911044 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81f3275c3b55a1f740c68491d4a52891729addfc87ac8642d59b960166a498d8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.936065 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:53 crc kubenswrapper[4914]: I0130 21:31:53.945793 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 21:31:55 crc kubenswrapper[4914]: I0130 21:31:55.124033 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 21:31:56 crc kubenswrapper[4914]: I0130 21:31:56.983250 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:31:56 crc kubenswrapper[4914]: I0130 21:31:56.983501 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:31:56 crc kubenswrapper[4914]: I0130 21:31:56.983541 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:31:56 crc kubenswrapper[4914]: I0130 21:31:56.984179 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af122f4ba69a9f285a7275f9a58f9bcc4666b137ea591150601d02ec4dc641e5"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:31:56 crc kubenswrapper[4914]: I0130 21:31:56.984227 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://af122f4ba69a9f285a7275f9a58f9bcc4666b137ea591150601d02ec4dc641e5" 
gracePeriod=600 Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.028611 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.030392 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.035133 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-l79xh" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.035367 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.035481 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.035629 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.041676 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.046724 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.147515 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe9f42c-7055-4099-ad8e-f827973007cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.147560 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rl4\" (UniqueName: 
\"kubernetes.io/projected/abe9f42c-7055-4099-ad8e-f827973007cd-kube-api-access-w5rl4\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.147588 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abe9f42c-7055-4099-ad8e-f827973007cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.147659 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.147689 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9f42c-7055-4099-ad8e-f827973007cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.147918 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.148334 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.148387 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249717 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249774 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249818 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe9f42c-7055-4099-ad8e-f827973007cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249840 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rl4\" (UniqueName: \"kubernetes.io/projected/abe9f42c-7055-4099-ad8e-f827973007cd-kube-api-access-w5rl4\") pod \"ovsdbserver-nb-0\" (UID: 
\"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249860 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abe9f42c-7055-4099-ad8e-f827973007cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249894 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249935 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9f42c-7055-4099-ad8e-f827973007cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.249950 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.250973 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abe9f42c-7055-4099-ad8e-f827973007cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.251729 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe9f42c-7055-4099-ad8e-f827973007cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.253907 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9f42c-7055-4099-ad8e-f827973007cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.256517 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.256554 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.256686 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.256730 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/93310374b71ed604e1c6c87b6a6437357a33bd5e041bf520f2a1ada5ea81f061/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.258981 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abe9f42c-7055-4099-ad8e-f827973007cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.273203 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5rl4\" (UniqueName: \"kubernetes.io/projected/abe9f42c-7055-4099-ad8e-f827973007cd-kube-api-access-w5rl4\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.307101 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d527f13-7f99-40f4-ba37-f50ef247ab26\") pod \"ovsdbserver-nb-0\" (UID: \"abe9f42c-7055-4099-ad8e-f827973007cd\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.353373 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rdzm9"] Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.358580 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.363427 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.363555 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.363604 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-48sxj" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.368316 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rdzm9"] Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.403263 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.416695 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kv2g9"] Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.419110 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.437204 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kv2g9"] Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.454659 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-lib\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.454747 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvcc4\" (UniqueName: \"kubernetes.io/projected/11cefee1-f5e9-4f79-b25b-8dae49655475-kube-api-access-dvcc4\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.454794 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f063a16-987d-4378-b889-966755034c3e-ovn-controller-tls-certs\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.454897 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f063a16-987d-4378-b889-966755034c3e-combined-ca-bundle\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455036 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/11cefee1-f5e9-4f79-b25b-8dae49655475-scripts\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455063 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-run-ovn\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455085 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-run\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455131 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-log-ovn\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455158 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjpv2\" (UniqueName: \"kubernetes.io/projected/3f063a16-987d-4378-b889-966755034c3e-kube-api-access-wjpv2\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455287 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3f063a16-987d-4378-b889-966755034c3e-scripts\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455309 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-etc-ovs\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455358 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-run\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.455616 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-log\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.556983 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-log\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557065 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-lib\") pod \"ovn-controller-ovs-kv2g9\" (UID: 
\"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557110 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvcc4\" (UniqueName: \"kubernetes.io/projected/11cefee1-f5e9-4f79-b25b-8dae49655475-kube-api-access-dvcc4\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557142 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f063a16-987d-4378-b889-966755034c3e-ovn-controller-tls-certs\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557200 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f063a16-987d-4378-b889-966755034c3e-combined-ca-bundle\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557230 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11cefee1-f5e9-4f79-b25b-8dae49655475-scripts\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557267 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-run-ovn\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 
crc kubenswrapper[4914]: I0130 21:31:57.557284 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-run\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557301 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-log-ovn\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557319 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjpv2\" (UniqueName: \"kubernetes.io/projected/3f063a16-987d-4378-b889-966755034c3e-kube-api-access-wjpv2\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557362 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f063a16-987d-4378-b889-966755034c3e-scripts\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557377 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-etc-ovs\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557395 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-run\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557557 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-log\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557639 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-lib\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557787 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-var-run\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557824 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-run\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557906 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/11cefee1-f5e9-4f79-b25b-8dae49655475-etc-ovs\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 
21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557925 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-log-ovn\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.557948 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f063a16-987d-4378-b889-966755034c3e-var-run-ovn\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.560390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11cefee1-f5e9-4f79-b25b-8dae49655475-scripts\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.560409 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f063a16-987d-4378-b889-966755034c3e-scripts\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.560566 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f063a16-987d-4378-b889-966755034c3e-ovn-controller-tls-certs\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.563319 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f063a16-987d-4378-b889-966755034c3e-combined-ca-bundle\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.577435 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvcc4\" (UniqueName: \"kubernetes.io/projected/11cefee1-f5e9-4f79-b25b-8dae49655475-kube-api-access-dvcc4\") pod \"ovn-controller-ovs-kv2g9\" (UID: \"11cefee1-f5e9-4f79-b25b-8dae49655475\") " pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.578053 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjpv2\" (UniqueName: \"kubernetes.io/projected/3f063a16-987d-4378-b889-966755034c3e-kube-api-access-wjpv2\") pod \"ovn-controller-rdzm9\" (UID: \"3f063a16-987d-4378-b889-966755034c3e\") " pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.684875 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rdzm9" Jan 30 21:31:57 crc kubenswrapper[4914]: I0130 21:31:57.735655 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:31:58 crc kubenswrapper[4914]: I0130 21:31:58.109082 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="af122f4ba69a9f285a7275f9a58f9bcc4666b137ea591150601d02ec4dc641e5" exitCode=0 Jan 30 21:31:58 crc kubenswrapper[4914]: I0130 21:31:58.109133 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"af122f4ba69a9f285a7275f9a58f9bcc4666b137ea591150601d02ec4dc641e5"} Jan 30 21:31:58 crc kubenswrapper[4914]: I0130 21:31:58.109175 4914 scope.go:117] "RemoveContainer" containerID="e121058e768dda1d14fe4563b4b94e4252170909803ddfd6651100686fef20ef" Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.956107 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.957871 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.966184 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rd6vn" Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.967482 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.967841 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.968046 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 21:32:00 crc kubenswrapper[4914]: I0130 21:32:00.985999 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045478 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045536 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045567 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-591beee8-7671-4842-8744-a1f75753f221\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-591beee8-7671-4842-8744-a1f75753f221\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045609 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045646 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d8330-2863-4fe8-96b8-2a751de6569d-config\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045675 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/555d8330-2863-4fe8-96b8-2a751de6569d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045692 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/555d8330-2863-4fe8-96b8-2a751de6569d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.045755 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrnc9\" (UniqueName: 
\"kubernetes.io/projected/555d8330-2863-4fe8-96b8-2a751de6569d-kube-api-access-xrnc9\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.146951 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.146998 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-591beee8-7671-4842-8744-a1f75753f221\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-591beee8-7671-4842-8744-a1f75753f221\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147036 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d8330-2863-4fe8-96b8-2a751de6569d-config\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147070 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/555d8330-2863-4fe8-96b8-2a751de6569d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147107 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/555d8330-2863-4fe8-96b8-2a751de6569d-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147222 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrnc9\" (UniqueName: \"kubernetes.io/projected/555d8330-2863-4fe8-96b8-2a751de6569d-kube-api-access-xrnc9\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147268 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147337 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147873 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d8330-2863-4fe8-96b8-2a751de6569d-config\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.147989 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/555d8330-2863-4fe8-96b8-2a751de6569d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.152448 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.153050 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.155837 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/555d8330-2863-4fe8-96b8-2a751de6569d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.165334 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555d8330-2863-4fe8-96b8-2a751de6569d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.174462 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrnc9\" (UniqueName: \"kubernetes.io/projected/555d8330-2863-4fe8-96b8-2a751de6569d-kube-api-access-xrnc9\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.177215 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.177264 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-591beee8-7671-4842-8744-a1f75753f221\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-591beee8-7671-4842-8744-a1f75753f221\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f636da9148c7699e71bfb3f3e90b795eb28f9417cf8faaf1802093dbf91cfa98/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.233308 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-591beee8-7671-4842-8744-a1f75753f221\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-591beee8-7671-4842-8744-a1f75753f221\") pod \"ovsdbserver-sb-0\" (UID: \"555d8330-2863-4fe8-96b8-2a751de6569d\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:01 crc kubenswrapper[4914]: I0130 21:32:01.280070 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.845263 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rnjqw"] Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.876482 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44"] Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.878186 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.880945 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.881153 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-h9zhm" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.881467 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.885361 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.885598 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.892814 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.892968 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc 
kubenswrapper[4914]: I0130 21:32:02.893048 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5ld\" (UniqueName: \"kubernetes.io/projected/915fbbd9-20c9-4552-bf18-a61af008b1d8-kube-api-access-hw5ld\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.893101 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.893123 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/915fbbd9-20c9-4552-bf18-a61af008b1d8-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.904751 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44"] Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.994988 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5ld\" (UniqueName: \"kubernetes.io/projected/915fbbd9-20c9-4552-bf18-a61af008b1d8-kube-api-access-hw5ld\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc 
kubenswrapper[4914]: I0130 21:32:02.995371 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.995403 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/915fbbd9-20c9-4552-bf18-a61af008b1d8-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.995442 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.995564 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.996902 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:02 crc kubenswrapper[4914]: I0130 21:32:02.997847 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/915fbbd9-20c9-4552-bf18-a61af008b1d8-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.004778 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.009405 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/915fbbd9-20c9-4552-bf18-a61af008b1d8-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.043643 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5ld\" (UniqueName: \"kubernetes.io/projected/915fbbd9-20c9-4552-bf18-a61af008b1d8-kube-api-access-hw5ld\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-5wh44\" (UID: \"915fbbd9-20c9-4552-bf18-a61af008b1d8\") " 
pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.048037 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.053875 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.062128 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.062345 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.062490 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.067280 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.151347 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.161043 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.161135 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.163918 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.168047 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.201569 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.201620 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.201642 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.201694 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8x7\" 
(UniqueName: \"kubernetes.io/projected/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-kube-api-access-zl8x7\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.201748 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.201778 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.203004 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.274486 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.275488 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.279688 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.280079 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.280299 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.280465 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-84cq8" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.281278 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.281474 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.281971 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.303154 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304558 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304594 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304628 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304646 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304670 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304697 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304737 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304757 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjt8\" (UniqueName: \"kubernetes.io/projected/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-kube-api-access-7bjt8\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304776 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304796 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304818 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304835 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304853 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304869 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: 
\"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304896 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304919 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmw6\" (UniqueName: \"kubernetes.io/projected/844bac7f-9f50-49c2-a05c-963b99ca4490-kube-api-access-tgmw6\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304937 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304954 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304972 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.304990 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl8x7\" (UniqueName: \"kubernetes.io/projected/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-kube-api-access-zl8x7\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.306590 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.307878 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.308636 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: 
\"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.312197 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.312847 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.321576 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.322793 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.339640 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj"] Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.342382 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8x7\" (UniqueName: \"kubernetes.io/projected/e528e0c0-c547-4d1d-8624-f8b2c8d450cf-kube-api-access-zl8x7\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-vq9hr\" (UID: \"e528e0c0-c547-4d1d-8624-f8b2c8d450cf\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.412960 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.413998 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414034 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjt8\" (UniqueName: \"kubernetes.io/projected/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-kube-api-access-7bjt8\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414059 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414090 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414108 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414131 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414155 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmw6\" (UniqueName: \"kubernetes.io/projected/844bac7f-9f50-49c2-a05c-963b99ca4490-kube-api-access-tgmw6\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " 
pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414176 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414192 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414212 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414249 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414284 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414305 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.414333 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.415042 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.419732 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " 
pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.420793 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: E0130 21:32:03.420822 4914 configmap.go:193] Couldn't get configMap openstack/cloudkitty-lokistack-gateway-ca-bundle: configmap "cloudkitty-lokistack-gateway-ca-bundle" not found Jan 30 21:32:03 crc kubenswrapper[4914]: E0130 21:32:03.420886 4914 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Jan 30 21:32:03 crc kubenswrapper[4914]: E0130 21:32:03.420916 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-ca-bundle podName:844bac7f-9f50-49c2-a05c-963b99ca4490 nodeName:}" failed. No retries permitted until 2026-01-30 21:32:03.920889192 +0000 UTC m=+1057.359526033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloudkitty-lokistack-gateway-ca-bundle" (UniqueName: "kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-ca-bundle") pod "cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" (UID: "844bac7f-9f50-49c2-a05c-963b99ca4490") : configmap "cloudkitty-lokistack-gateway-ca-bundle" not found Jan 30 21:32:03 crc kubenswrapper[4914]: E0130 21:32:03.420947 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tls-secret podName:844bac7f-9f50-49c2-a05c-963b99ca4490 nodeName:}" failed. 
No retries permitted until 2026-01-30 21:32:03.920933203 +0000 UTC m=+1057.359570084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tls-secret") pod "cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" (UID: "844bac7f-9f50-49c2-a05c-963b99ca4490") : secret "cloudkitty-lokistack-gateway-http" not found Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.421782 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.422002 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.426343 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.431170 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-query-frontend-grpc\") pod 
\"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.431242 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.431574 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.431694 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.447341 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjt8\" (UniqueName: \"kubernetes.io/projected/c2060bc5-fb2c-4421-b6a0-7acbd5549c8d-kube-api-access-7bjt8\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k\" (UID: \"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: 
I0130 21:32:03.448131 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmw6\" (UniqueName: \"kubernetes.io/projected/844bac7f-9f50-49c2-a05c-963b99ca4490-kube-api-access-tgmw6\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.482489 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.516292 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.516582 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.516696 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.516821 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.517106 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.517228 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dbnt\" (UniqueName: \"kubernetes.io/projected/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-kube-api-access-2dbnt\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.517257 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.517459 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-secret\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.517517 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.618885 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbnt\" (UniqueName: \"kubernetes.io/projected/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-kube-api-access-2dbnt\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.618933 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619042 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" 
Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619078 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619167 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619196 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619237 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619307 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-gateway-client-http\") 
pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.619367 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: E0130 21:32:03.619607 4914 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Jan 30 21:32:03 crc kubenswrapper[4914]: E0130 21:32:03.619773 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tls-secret podName:2954a978-cc4d-4e5a-95af-d3bab9a9b3d1 nodeName:}" failed. No retries permitted until 2026-01-30 21:32:04.119753836 +0000 UTC m=+1057.558390607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tls-secret") pod "cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" (UID: "2954a978-cc4d-4e5a-95af-d3bab9a9b3d1") : secret "cloudkitty-lokistack-gateway-http" not found Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.620532 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.621033 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.621293 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.622136 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 
crc kubenswrapper[4914]: I0130 21:32:03.622346 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.629137 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.636814 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbnt\" (UniqueName: \"kubernetes.io/projected/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-kube-api-access-2dbnt\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.639167 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.924470 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tls-secret\") pod 
\"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.925396 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.926398 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844bac7f-9f50-49c2-a05c-963b99ca4490-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:03 crc kubenswrapper[4914]: I0130 21:32:03.928338 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/844bac7f-9f50-49c2-a05c-963b99ca4490-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv\" (UID: \"844bac7f-9f50-49c2-a05c-963b99ca4490\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.029896 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.031186 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.033821 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.035262 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.045415 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.125405 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.126657 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129234 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129276 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129279 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129366 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: 
\"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129386 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129406 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhls\" (UniqueName: \"kubernetes.io/projected/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-kube-api-access-ljhls\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129437 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129463 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 
21:32:04.129528 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129785 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.129822 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.133487 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2954a978-cc4d-4e5a-95af-d3bab9a9b3d1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj\" (UID: \"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.136176 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.200304 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.209065 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.211377 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.217490 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.219518 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.227810 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.233363 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.233474 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.233549 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234391 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxps\" (UniqueName: \"kubernetes.io/projected/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-kube-api-access-8fxps\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234437 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234514 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234626 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc 
kubenswrapper[4914]: I0130 21:32:04.234688 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234722 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234774 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234804 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234825 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.234939 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhls\" (UniqueName: \"kubernetes.io/projected/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-kube-api-access-ljhls\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.235124 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.235225 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.237758 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.238547 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") device mount path 
\"/mnt/openstack/pv11\"" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.239173 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.239911 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.241806 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.246502 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.247995 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-cloudkitty-lokistack-ingester-grpc\") pod 
\"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.257659 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.258046 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhls\" (UniqueName: \"kubernetes.io/projected/79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0-kube-api-access-ljhls\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.261062 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.282298 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337683 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337773 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337842 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337867 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337885 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337909 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337943 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337950 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.337963 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxps\" (UniqueName: 
\"kubernetes.io/projected/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-kube-api-access-8fxps\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.338213 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.338319 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.338354 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbb7\" (UniqueName: \"kubernetes.io/projected/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-kube-api-access-nhbb7\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.338423 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.338463 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.338540 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.339001 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.339371 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.342104 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.343630 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.344943 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.345120 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.354940 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxps\" (UniqueName: \"kubernetes.io/projected/1cd64ca8-c110-4af1-ad2e-edbed561a3b3-kube-api-access-8fxps\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.382812 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"1cd64ca8-c110-4af1-ad2e-edbed561a3b3\") " 
pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.440019 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.440386 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.441562 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.440435 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbb7\" (UniqueName: \"kubernetes.io/projected/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-kube-api-access-nhbb7\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.442937 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.443017 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.443100 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.443120 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.443781 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.444021 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.447274 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.450346 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.472296 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbb7\" (UniqueName: \"kubernetes.io/projected/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-kube-api-access-nhbb7\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.476424 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 
21:32:04.486608 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.496957 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:04 crc kubenswrapper[4914]: I0130 21:32:04.528451 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:05 crc kubenswrapper[4914]: W0130 21:32:05.223485 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63625a35_5028_4dda_b9b3_ec3910fd8385.slice/crio-7e3f6df36c20df9538ceddf8e65f59d27798cdc4c41684d073e6fb97310f824f WatchSource:0}: Error finding container 7e3f6df36c20df9538ceddf8e65f59d27798cdc4c41684d073e6fb97310f824f: Status 404 returned error can't find the container with id 7e3f6df36c20df9538ceddf8e65f59d27798cdc4c41684d073e6fb97310f824f Jan 30 21:32:05 crc kubenswrapper[4914]: E0130 21:32:05.231200 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 30 21:32:05 crc kubenswrapper[4914]: E0130 21:32:05.231430 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts 
--domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5k9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-m99ms_openstack(f117df6e-671b-4401-8056-0d094bf65b8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:05 crc kubenswrapper[4914]: E0130 
21:32:05.232620 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" podUID="f117df6e-671b-4401-8056-0d094bf65b8b" Jan 30 21:32:05 crc kubenswrapper[4914]: W0130 21:32:05.292427 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6edeafb_6617_4058_9b35_bf0bb078ceba.slice/crio-73037bf5311c6edc31d17452cc77921416acdeb8183b92775f460648d8b74a3e WatchSource:0}: Error finding container 73037bf5311c6edc31d17452cc77921416acdeb8183b92775f460648d8b74a3e: Status 404 returned error can't find the container with id 73037bf5311c6edc31d17452cc77921416acdeb8183b92775f460648d8b74a3e Jan 30 21:32:05 crc kubenswrapper[4914]: E0130 21:32:05.326129 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 30 21:32:05 crc kubenswrapper[4914]: E0130 21:32:05.326315 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xsnzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-srgc7_openstack(eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:05 crc kubenswrapper[4914]: E0130 21:32:05.327494 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" podUID="eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d" Jan 30 21:32:05 crc kubenswrapper[4914]: I0130 21:32:05.810458 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.180888 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63625a35-5028-4dda-b9b3-ec3910fd8385","Type":"ContainerStarted","Data":"7e3f6df36c20df9538ceddf8e65f59d27798cdc4c41684d073e6fb97310f824f"} Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.185052 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"f0fa301f4a7d6f2d2094968ff039d7aedbb13e612ee90301cf0076f1904de139"} Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.194933 4914 generic.go:334] "Generic (PLEG): container finished" podID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerID="28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135" exitCode=0 Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.195128 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" event={"ID":"a6edeafb-6617-4058-9b35-bf0bb078ceba","Type":"ContainerDied","Data":"28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135"} Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.195259 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" event={"ID":"a6edeafb-6617-4058-9b35-bf0bb078ceba","Type":"ContainerStarted","Data":"73037bf5311c6edc31d17452cc77921416acdeb8183b92775f460648d8b74a3e"} Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.207040 4914 generic.go:334] "Generic (PLEG): container finished" podID="fa9f2445-e517-4f92-a54e-6008fc190663" 
containerID="2cfbdc0f7e98bbbad96f0c6672a4c30d60ec519b33ec2740fc0be4311c5bd174" exitCode=0 Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.207237 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" event={"ID":"fa9f2445-e517-4f92-a54e-6008fc190663","Type":"ContainerDied","Data":"2cfbdc0f7e98bbbad96f0c6672a4c30d60ec519b33ec2740fc0be4311c5bd174"} Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.210342 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f394410a-5ff7-4a0c-84ec-4b60c63c707c","Type":"ContainerStarted","Data":"cdaf857891566689b9d960e23755cfc5fe462fa524e87d277df9c1cacdaa1b5b"} Jan 30 21:32:06 crc kubenswrapper[4914]: E0130 21:32:06.478025 4914 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 30 21:32:06 crc kubenswrapper[4914]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fa9f2445-e517-4f92-a54e-6008fc190663/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 21:32:06 crc kubenswrapper[4914]: > podSandboxID="ad6f410a45bb66f49bd1985d2c7c9f463d0e0baa658e592afd1ca03af987d485" Jan 30 21:32:06 crc kubenswrapper[4914]: E0130 21:32:06.478384 4914 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 21:32:06 crc kubenswrapper[4914]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cqs6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-7jjwv_openstack(fa9f2445-e517-4f92-a54e-6008fc190663): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fa9f2445-e517-4f92-a54e-6008fc190663/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 21:32:06 crc kubenswrapper[4914]: > logger="UnhandledError" Jan 30 21:32:06 crc kubenswrapper[4914]: E0130 21:32:06.479504 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fa9f2445-e517-4f92-a54e-6008fc190663/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.515002 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 21:32:06 crc kubenswrapper[4914]: W0130 21:32:06.530126 4914 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46107121_a72c_40a7_904c_24c6c33de7c4.slice/crio-39d155d986eeb634c0dfc6f79c5b80c43045f72a0f7fc9970c20f4f615899937 WatchSource:0}: Error finding container 39d155d986eeb634c0dfc6f79c5b80c43045f72a0f7fc9970c20f4f615899937: Status 404 returned error can't find the container with id 39d155d986eeb634c0dfc6f79c5b80c43045f72a0f7fc9970c20f4f615899937 Jan 30 21:32:06 crc kubenswrapper[4914]: W0130 21:32:06.533504 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda3bc7da_e810_4d0a_a7df_792c544f3a23.slice/crio-5f126e0d3eeca0695b36e6d7a7fa089c8814af19c0ee11dc471f785153012e3a WatchSource:0}: Error finding container 5f126e0d3eeca0695b36e6d7a7fa089c8814af19c0ee11dc471f785153012e3a: Status 404 returned error can't find the container with id 5f126e0d3eeca0695b36e6d7a7fa089c8814af19c0ee11dc471f785153012e3a Jan 30 21:32:06 crc kubenswrapper[4914]: W0130 21:32:06.543455 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dbbcbee_a7d4_4638_9d20_dbeda6ccdde0.slice/crio-3328d3c2f7ab60d229b02560f00fac935a8372b236f64be35ae74bf53d45af3c WatchSource:0}: Error finding container 3328d3c2f7ab60d229b02560f00fac935a8372b236f64be35ae74bf53d45af3c: Status 404 returned error can't find the container with id 3328d3c2f7ab60d229b02560f00fac935a8372b236f64be35ae74bf53d45af3c Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.543963 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.559556 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:32:06 crc kubenswrapper[4914]: I0130 21:32:06.566751 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 21:32:07 crc kubenswrapper[4914]: 
I0130 21:32:07.015294 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.111785 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.174402 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k"] Jan 30 21:32:07 crc kubenswrapper[4914]: W0130 21:32:07.205942 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2060bc5_fb2c_4421_b6a0_7acbd5549c8d.slice/crio-9b6f24bff615fcdd26d367d5805c7206bb47254676931122609fefde82dcfd09 WatchSource:0}: Error finding container 9b6f24bff615fcdd26d367d5805c7206bb47254676931122609fefde82dcfd09: Status 404 returned error can't find the container with id 9b6f24bff615fcdd26d367d5805c7206bb47254676931122609fefde82dcfd09 Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.232843 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 30 21:32:07 crc kubenswrapper[4914]: W0130 21:32:07.245737 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b4ebe0e_413b_4b5e_9239_a946ce2ca0f5.slice/crio-da8067f44204bc5deb3aac41f9de2cfdcd1b9d858219971139ba998ce84546c4 WatchSource:0}: Error finding container da8067f44204bc5deb3aac41f9de2cfdcd1b9d858219971139ba998ce84546c4: Status 404 returned error can't find the container with id da8067f44204bc5deb3aac41f9de2cfdcd1b9d858219971139ba998ce84546c4 Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.253665 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.255265 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" event={"ID":"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d","Type":"ContainerStarted","Data":"9b6f24bff615fcdd26d367d5805c7206bb47254676931122609fefde82dcfd09"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.257942 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"1cd64ca8-c110-4af1-ad2e-edbed561a3b3","Type":"ContainerStarted","Data":"ae001fd26f63d3d5832e1389a9c17e3fb5bec0fab079525c10cfa8a8c8ec2bba"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.260533 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0","Type":"ContainerStarted","Data":"3328d3c2f7ab60d229b02560f00fac935a8372b236f64be35ae74bf53d45af3c"} Jan 30 21:32:07 crc kubenswrapper[4914]: W0130 21:32:07.263303 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2fd3a0_3c39_4fda_aa23_1a7d79f0d8e1.slice/crio-20ab92bdccb13fd50a7824249fa8712bd7ff89a2d162d0a4859a3093a334ca99 WatchSource:0}: Error finding container 20ab92bdccb13fd50a7824249fa8712bd7ff89a2d162d0a4859a3093a334ca99: Status 404 returned error can't find the container with id 20ab92bdccb13fd50a7824249fa8712bd7ff89a2d162d0a4859a3093a334ca99 Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.263852 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" event={"ID":"a6edeafb-6617-4058-9b35-bf0bb078ceba","Type":"ContainerStarted","Data":"7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.264244 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.265940 4914 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.266412 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"da3bc7da-e810-4d0a-a7df-792c544f3a23","Type":"ContainerStarted","Data":"5f126e0d3eeca0695b36e6d7a7fa089c8814af19c0ee11dc471f785153012e3a"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.267938 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.268827 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c506e0ae-e4b2-4cd7-87ea-bc10619f874e","Type":"ContainerStarted","Data":"a2496c0d0ae4fc3f67cdd3ffe07d3825c547990ddea2f3de0bdb1f06d1255103"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.270783 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" event={"ID":"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1","Type":"ContainerStarted","Data":"9c576685db6b9e9ad30b52ab1c26662c1033bcc25c3e948e341d3fd18c04ce9a"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.270872 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.274720 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"46107121-a72c-40a7-904c-24c6c33de7c4","Type":"ContainerStarted","Data":"39d155d986eeb634c0dfc6f79c5b80c43045f72a0f7fc9970c20f4f615899937"} Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.281845 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.292017 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" podStartSLOduration=20.797571792 podStartE2EDuration="21.29199816s" podCreationTimestamp="2026-01-30 21:31:46 +0000 UTC" firstStartedPulling="2026-01-30 21:32:05.30054964 +0000 UTC m=+1058.739186421" lastFinishedPulling="2026-01-30 21:32:05.794976028 +0000 UTC m=+1059.233612789" observedRunningTime="2026-01-30 21:32:07.285736911 +0000 UTC m=+1060.724373672" watchObservedRunningTime="2026-01-30 21:32:07.29199816 +0000 UTC m=+1060.730634921" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.436730 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5k9p\" (UniqueName: \"kubernetes.io/projected/f117df6e-671b-4401-8056-0d094bf65b8b-kube-api-access-l5k9p\") pod \"f117df6e-671b-4401-8056-0d094bf65b8b\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.436825 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-dns-svc\") pod \"f117df6e-671b-4401-8056-0d094bf65b8b\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.436855 4914 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-config\") pod \"f117df6e-671b-4401-8056-0d094bf65b8b\" (UID: \"f117df6e-671b-4401-8056-0d094bf65b8b\") " Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.436986 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsnzc\" (UniqueName: \"kubernetes.io/projected/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-kube-api-access-xsnzc\") pod \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.437036 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-config\") pod \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\" (UID: \"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d\") " Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.437241 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f117df6e-671b-4401-8056-0d094bf65b8b" (UID: "f117df6e-671b-4401-8056-0d094bf65b8b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.437874 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-config" (OuterVolumeSpecName: "config") pod "eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d" (UID: "eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.437960 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.438453 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-config" (OuterVolumeSpecName: "config") pod "f117df6e-671b-4401-8056-0d094bf65b8b" (UID: "f117df6e-671b-4401-8056-0d094bf65b8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.442100 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-kube-api-access-xsnzc" (OuterVolumeSpecName: "kube-api-access-xsnzc") pod "eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d" (UID: "eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d"). InnerVolumeSpecName "kube-api-access-xsnzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.442235 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f117df6e-671b-4401-8056-0d094bf65b8b-kube-api-access-l5k9p" (OuterVolumeSpecName: "kube-api-access-l5k9p") pod "f117df6e-671b-4401-8056-0d094bf65b8b" (UID: "f117df6e-671b-4401-8056-0d094bf65b8b"). InnerVolumeSpecName "kube-api-access-l5k9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.500694 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kv2g9"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.539134 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsnzc\" (UniqueName: \"kubernetes.io/projected/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-kube-api-access-xsnzc\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.539160 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.539168 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5k9p\" (UniqueName: \"kubernetes.io/projected/f117df6e-671b-4401-8056-0d094bf65b8b-kube-api-access-l5k9p\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.539177 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f117df6e-671b-4401-8056-0d094bf65b8b-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.608127 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.624555 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr"] Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.633959 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rdzm9"] Jan 30 21:32:07 crc kubenswrapper[4914]: W0130 21:32:07.641730 4914 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod844bac7f_9f50_49c2_a05c_963b99ca4490.slice/crio-130c67ef2bac9c8639ad0aafc00d6de63891d78f2462fe497e03a38a8205aff1 WatchSource:0}: Error finding container 130c67ef2bac9c8639ad0aafc00d6de63891d78f2462fe497e03a38a8205aff1: Status 404 returned error can't find the container with id 130c67ef2bac9c8639ad0aafc00d6de63891d78f2462fe497e03a38a8205aff1 Jan 30 21:32:07 crc kubenswrapper[4914]: W0130 21:32:07.644419 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f063a16_987d_4378_b889_966755034c3e.slice/crio-727fc2e3bdab15550236cbb9179e6bbe817d733347347caab38211f5a5c25d5c WatchSource:0}: Error finding container 727fc2e3bdab15550236cbb9179e6bbe817d733347347caab38211f5a5c25d5c: Status 404 returned error can't find the container with id 727fc2e3bdab15550236cbb9179e6bbe817d733347347caab38211f5a5c25d5c Jan 30 21:32:07 crc kubenswrapper[4914]: I0130 21:32:07.644379 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv"] Jan 30 21:32:07 crc kubenswrapper[4914]: W0130 21:32:07.645494 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915fbbd9_20c9_4552_bf18_a61af008b1d8.slice/crio-dc9e3f4372121ac01950fef55cf8edfa4e5d8d56d7d21eeaf27d78ad789a5f2a WatchSource:0}: Error finding container dc9e3f4372121ac01950fef55cf8edfa4e5d8d56d7d21eeaf27d78ad789a5f2a: Status 404 returned error can't find the container with id dc9e3f4372121ac01950fef55cf8edfa4e5d8d56d7d21eeaf27d78ad789a5f2a Jan 30 21:32:07 crc kubenswrapper[4914]: E0130 21:32:07.649454 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-distributor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2b491fcb180423632d30811515a439a7a7f41023c1cfe4780647f18969b85a1d,Command:[],Args:[-target=distributor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hw5ld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-distributor-66dfd9bb-5wh44_openstack(915fbbd9-20c9-4552-bf18-a61af008b1d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:32:07 crc kubenswrapper[4914]: E0130 21:32:07.650532 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" podUID="915fbbd9-20c9-4552-bf18-a61af008b1d8" Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.175724 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.282044 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" 
event={"ID":"eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d","Type":"ContainerDied","Data":"6892df0d5b5d20e59555e60013bd63322529784f4826473504594e7fc1a0ce8d"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.282136 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-srgc7" Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.286832 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5","Type":"ContainerStarted","Data":"da8067f44204bc5deb3aac41f9de2cfdcd1b9d858219971139ba998ce84546c4"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.292975 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"134b35c4-3656-4890-8cb2-76bc09779403","Type":"ContainerStarted","Data":"2fbaaf1a1f3846be96c93c078f9f73d1c804ee7b19f16f8275ff2b251a9d20df"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.294722 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rdzm9" event={"ID":"3f063a16-987d-4378-b889-966755034c3e","Type":"ContainerStarted","Data":"727fc2e3bdab15550236cbb9179e6bbe817d733347347caab38211f5a5c25d5c"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.296613 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" event={"ID":"fa9f2445-e517-4f92-a54e-6008fc190663","Type":"ContainerStarted","Data":"1312e723afc227741e8020fe9a5bd52adef6f59e2dabd3195d13818c0bef2c54"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.297263 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.298422 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kv2g9" 
event={"ID":"11cefee1-f5e9-4f79-b25b-8dae49655475","Type":"ContainerStarted","Data":"4649512d73245173242c9b180a334b2a254deec7a898f48cda4a5e3d59ee1f70"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.298875 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.300652 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" event={"ID":"e528e0c0-c547-4d1d-8624-f8b2c8d450cf","Type":"ContainerStarted","Data":"c8333a7501245cbf3ee045377d0e6a56603f74df019ca867e2f7d8d26ef714f9"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.302201 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerStarted","Data":"20ab92bdccb13fd50a7824249fa8712bd7ff89a2d162d0a4859a3093a334ca99"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.319081 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" event={"ID":"f117df6e-671b-4401-8056-0d094bf65b8b","Type":"ContainerDied","Data":"92fb36db52ab126ee5601ecc4b10af018e6112617c5a51c7aa4999947d2ce3ae"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.319535 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-m99ms" Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.322389 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0","Type":"ContainerStarted","Data":"f897b99a77a5e785baa2be95420fddc57e989c42eb76b04642bb06979ce379a5"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.324814 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" event={"ID":"915fbbd9-20c9-4552-bf18-a61af008b1d8","Type":"ContainerStarted","Data":"dc9e3f4372121ac01950fef55cf8edfa4e5d8d56d7d21eeaf27d78ad789a5f2a"} Jan 30 21:32:08 crc kubenswrapper[4914]: E0130 21:32:08.326411 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2b491fcb180423632d30811515a439a7a7f41023c1cfe4780647f18969b85a1d\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" podUID="915fbbd9-20c9-4552-bf18-a61af008b1d8" Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.339404 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-srgc7"] Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.347244 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-srgc7"] Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.349969 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" event={"ID":"844bac7f-9f50-49c2-a05c-963b99ca4490","Type":"ContainerStarted","Data":"130c67ef2bac9c8639ad0aafc00d6de63891d78f2462fe497e03a38a8205aff1"} Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.351552 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" podStartSLOduration=7.828279207 podStartE2EDuration="22.351538045s" podCreationTimestamp="2026-01-30 21:31:46 +0000 UTC" firstStartedPulling="2026-01-30 21:31:51.086092553 +0000 UTC m=+1044.524729314" lastFinishedPulling="2026-01-30 21:32:05.609351391 +0000 UTC m=+1059.047988152" observedRunningTime="2026-01-30 21:32:08.336545836 +0000 UTC m=+1061.775182597" watchObservedRunningTime="2026-01-30 21:32:08.351538045 +0000 UTC m=+1061.790174806" Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.390341 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-m99ms"] Jan 30 21:32:08 crc kubenswrapper[4914]: I0130 21:32:08.396755 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-m99ms"] Jan 30 21:32:09 crc kubenswrapper[4914]: E0130 21:32:09.360167 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2b491fcb180423632d30811515a439a7a7f41023c1cfe4780647f18969b85a1d\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" podUID="915fbbd9-20c9-4552-bf18-a61af008b1d8" Jan 30 21:32:09 crc kubenswrapper[4914]: I0130 21:32:09.837009 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d" path="/var/lib/kubelet/pods/eaf032c8-64a7-4bfb-9f3f-2c76e7874e1d/volumes" Jan 30 21:32:09 crc kubenswrapper[4914]: I0130 21:32:09.838004 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f117df6e-671b-4401-8056-0d094bf65b8b" path="/var/lib/kubelet/pods/f117df6e-671b-4401-8056-0d094bf65b8b/volumes" Jan 30 21:32:12 crc kubenswrapper[4914]: I0130 21:32:11.837263 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:32:12 crc 
kubenswrapper[4914]: I0130 21:32:11.909023 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7jjwv"] Jan 30 21:32:12 crc kubenswrapper[4914]: I0130 21:32:11.909539 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="dnsmasq-dns" containerID="cri-o://1312e723afc227741e8020fe9a5bd52adef6f59e2dabd3195d13818c0bef2c54" gracePeriod=10 Jan 30 21:32:12 crc kubenswrapper[4914]: I0130 21:32:12.386926 4914 generic.go:334] "Generic (PLEG): container finished" podID="fa9f2445-e517-4f92-a54e-6008fc190663" containerID="1312e723afc227741e8020fe9a5bd52adef6f59e2dabd3195d13818c0bef2c54" exitCode=0 Jan 30 21:32:12 crc kubenswrapper[4914]: I0130 21:32:12.387218 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" event={"ID":"fa9f2445-e517-4f92-a54e-6008fc190663","Type":"ContainerDied","Data":"1312e723afc227741e8020fe9a5bd52adef6f59e2dabd3195d13818c0bef2c54"} Jan 30 21:32:12 crc kubenswrapper[4914]: W0130 21:32:12.504883 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod555d8330_2863_4fe8_96b8_2a751de6569d.slice/crio-0ef5b252542bf43ed0f8042bb91b117782cdf06cd9597410025b2a840e395e81 WatchSource:0}: Error finding container 0ef5b252542bf43ed0f8042bb91b117782cdf06cd9597410025b2a840e395e81: Status 404 returned error can't find the container with id 0ef5b252542bf43ed0f8042bb91b117782cdf06cd9597410025b2a840e395e81 Jan 30 21:32:12 crc kubenswrapper[4914]: W0130 21:32:12.506118 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe9f42c_7055_4099_ad8e_f827973007cd.slice/crio-5f73b43afa8aa7be9d7b5a2e8fb6d63042731bacd9486cb1e629a370571feee9 WatchSource:0}: Error finding container 
5f73b43afa8aa7be9d7b5a2e8fb6d63042731bacd9486cb1e629a370571feee9: Status 404 returned error can't find the container with id 5f73b43afa8aa7be9d7b5a2e8fb6d63042731bacd9486cb1e629a370571feee9 Jan 30 21:32:13 crc kubenswrapper[4914]: I0130 21:32:13.394858 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"555d8330-2863-4fe8-96b8-2a751de6569d","Type":"ContainerStarted","Data":"0ef5b252542bf43ed0f8042bb91b117782cdf06cd9597410025b2a840e395e81"} Jan 30 21:32:13 crc kubenswrapper[4914]: I0130 21:32:13.397305 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abe9f42c-7055-4099-ad8e-f827973007cd","Type":"ContainerStarted","Data":"5f73b43afa8aa7be9d7b5a2e8fb6d63042731bacd9486cb1e629a370571feee9"} Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.655916 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.744489 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqs6b\" (UniqueName: \"kubernetes.io/projected/fa9f2445-e517-4f92-a54e-6008fc190663-kube-api-access-cqs6b\") pod \"fa9f2445-e517-4f92-a54e-6008fc190663\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.744677 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-dns-svc\") pod \"fa9f2445-e517-4f92-a54e-6008fc190663\" (UID: \"fa9f2445-e517-4f92-a54e-6008fc190663\") " Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.744815 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-config\") pod \"fa9f2445-e517-4f92-a54e-6008fc190663\" (UID: 
\"fa9f2445-e517-4f92-a54e-6008fc190663\") " Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.753158 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9f2445-e517-4f92-a54e-6008fc190663-kube-api-access-cqs6b" (OuterVolumeSpecName: "kube-api-access-cqs6b") pod "fa9f2445-e517-4f92-a54e-6008fc190663" (UID: "fa9f2445-e517-4f92-a54e-6008fc190663"). InnerVolumeSpecName "kube-api-access-cqs6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.846980 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqs6b\" (UniqueName: \"kubernetes.io/projected/fa9f2445-e517-4f92-a54e-6008fc190663-kube-api-access-cqs6b\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.863361 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa9f2445-e517-4f92-a54e-6008fc190663" (UID: "fa9f2445-e517-4f92-a54e-6008fc190663"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.864229 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-config" (OuterVolumeSpecName: "config") pod "fa9f2445-e517-4f92-a54e-6008fc190663" (UID: "fa9f2445-e517-4f92-a54e-6008fc190663"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.952816 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:18 crc kubenswrapper[4914]: I0130 21:32:18.952847 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9f2445-e517-4f92-a54e-6008fc190663-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:19 crc kubenswrapper[4914]: I0130 21:32:19.452615 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" event={"ID":"fa9f2445-e517-4f92-a54e-6008fc190663","Type":"ContainerDied","Data":"ad6f410a45bb66f49bd1985d2c7c9f463d0e0baa658e592afd1ca03af987d485"} Jan 30 21:32:19 crc kubenswrapper[4914]: I0130 21:32:19.452673 4914 scope.go:117] "RemoveContainer" containerID="1312e723afc227741e8020fe9a5bd52adef6f59e2dabd3195d13818c0bef2c54" Jan 30 21:32:19 crc kubenswrapper[4914]: I0130 21:32:19.452745 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" Jan 30 21:32:19 crc kubenswrapper[4914]: I0130 21:32:19.494945 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7jjwv"] Jan 30 21:32:19 crc kubenswrapper[4914]: I0130 21:32:19.503170 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7jjwv"] Jan 30 21:32:19 crc kubenswrapper[4914]: I0130 21:32:19.834908 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" path="/var/lib/kubelet/pods/fa9f2445-e517-4f92-a54e-6008fc190663/volumes" Jan 30 21:32:20 crc kubenswrapper[4914]: E0130 21:32:20.669869 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 30 21:32:20 crc kubenswrapper[4914]: E0130 21:32:20.670478 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2f7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(c506e0ae-e4b2-4cd7-87ea-bc10619f874e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:20 crc 
kubenswrapper[4914]: E0130 21:32:20.672809 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.047328 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-68th2"] Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.048046 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="dnsmasq-dns" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.048073 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="dnsmasq-dns" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.048104 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="init" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.048116 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="init" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.048592 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="dnsmasq-dns" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.050445 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.057775 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.058619 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-68th2"] Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.079872 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.080204 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/alertmanager/config/alertmanager.yaml.gz --config-envsubst-file=/etc/alertmanager/config_out/alertmanager.env.yaml --watched-dir=/etc/alertmanager/config],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-init,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:-1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-volume,ReadOnly:true,MountPath:/etc/alertmanager/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/alertmanager/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/alertmanager/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fktlx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod alertmanager-metric-storage-0_openstack(46107121-a72c-40a7-904c-24c6c33de7c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.081450 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/alertmanager-metric-storage-0" 
podUID="46107121-a72c-40a7-904c-24c6c33de7c4" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.126035 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.126201 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hqh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(f394410a-5ff7-4a0c-84ec-4b60c63c707c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:21 crc 
kubenswrapper[4914]: E0130 21:32:21.128142 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.211358 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-59mcr"] Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.212865 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.215117 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.216198 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c819be77-2b86-4bbf-9e4b-f9738f59032d-ovs-rundir\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.216300 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c819be77-2b86-4bbf-9e4b-f9738f59032d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.216338 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c819be77-2b86-4bbf-9e4b-f9738f59032d-ovn-rundir\") pod \"ovn-controller-metrics-68th2\" (UID: 
\"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.216371 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819be77-2b86-4bbf-9e4b-f9738f59032d-config\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.216409 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26khr\" (UniqueName: \"kubernetes.io/projected/c819be77-2b86-4bbf-9e4b-f9738f59032d-kube-api-access-26khr\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.216431 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c819be77-2b86-4bbf-9e4b-f9738f59032d-combined-ca-bundle\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.242801 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-59mcr"] Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318581 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcpg8\" (UniqueName: \"kubernetes.io/projected/771fcfac-9ef4-440d-a70f-54808c02081f-kube-api-access-vcpg8\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318681 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c819be77-2b86-4bbf-9e4b-f9738f59032d-ovs-rundir\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318748 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318794 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c819be77-2b86-4bbf-9e4b-f9738f59032d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318830 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318857 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c819be77-2b86-4bbf-9e4b-f9738f59032d-ovn-rundir\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318899 4914 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819be77-2b86-4bbf-9e4b-f9738f59032d-config\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318922 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-config\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318968 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26khr\" (UniqueName: \"kubernetes.io/projected/c819be77-2b86-4bbf-9e4b-f9738f59032d-kube-api-access-26khr\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.318995 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c819be77-2b86-4bbf-9e4b-f9738f59032d-combined-ca-bundle\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.321008 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c819be77-2b86-4bbf-9e4b-f9738f59032d-ovs-rundir\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.321131 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/c819be77-2b86-4bbf-9e4b-f9738f59032d-ovn-rundir\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.321596 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819be77-2b86-4bbf-9e4b-f9738f59032d-config\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.326638 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c819be77-2b86-4bbf-9e4b-f9738f59032d-combined-ca-bundle\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.326989 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c819be77-2b86-4bbf-9e4b-f9738f59032d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.339286 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26khr\" (UniqueName: \"kubernetes.io/projected/c819be77-2b86-4bbf-9e4b-f9738f59032d-kube-api-access-26khr\") pod \"ovn-controller-metrics-68th2\" (UID: \"c819be77-2b86-4bbf-9e4b-f9738f59032d\") " pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.351833 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-59mcr"] Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.352844 4914 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-vcpg8 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" podUID="771fcfac-9ef4-440d-a70f-54808c02081f" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.382845 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-q69cm"] Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.392354 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.396009 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.407655 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-q69cm"] Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.419200 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-68th2" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.420476 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcpg8\" (UniqueName: \"kubernetes.io/projected/771fcfac-9ef4-440d-a70f-54808c02081f-kube-api-access-vcpg8\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.420543 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.420589 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.420625 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-config\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.421479 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-config\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc 
kubenswrapper[4914]: I0130 21:32:21.423363 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.424073 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.444733 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcpg8\" (UniqueName: \"kubernetes.io/projected/771fcfac-9ef4-440d-a70f-54808c02081f-kube-api-access-vcpg8\") pod \"dnsmasq-dns-7fd796d7df-59mcr\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.468813 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.485192 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.485267 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a\\\"\"" pod="openstack/alertmanager-metric-storage-0" podUID="46107121-a72c-40a7-904c-24c6c33de7c4" Jan 30 21:32:21 crc kubenswrapper[4914]: E0130 21:32:21.484984 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.487446 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.527515 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftgnd\" (UniqueName: \"kubernetes.io/projected/78a46162-d658-4f7b-8643-1ffca846a558-kube-api-access-ftgnd\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.527678 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.527776 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-config\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.527803 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.527919 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" 
(UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.562610 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-7jjwv" podUID="fa9f2445-e517-4f92-a54e-6008fc190663" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: i/o timeout" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.630422 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-dns-svc\") pod \"771fcfac-9ef4-440d-a70f-54808c02081f\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.630575 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-config\") pod \"771fcfac-9ef4-440d-a70f-54808c02081f\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.630648 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcpg8\" (UniqueName: \"kubernetes.io/projected/771fcfac-9ef4-440d-a70f-54808c02081f-kube-api-access-vcpg8\") pod \"771fcfac-9ef4-440d-a70f-54808c02081f\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.630741 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-ovsdbserver-nb\") pod \"771fcfac-9ef4-440d-a70f-54808c02081f\" (UID: \"771fcfac-9ef4-440d-a70f-54808c02081f\") " Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.630884 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "771fcfac-9ef4-440d-a70f-54808c02081f" (UID: "771fcfac-9ef4-440d-a70f-54808c02081f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.631059 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.631117 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "771fcfac-9ef4-440d-a70f-54808c02081f" (UID: "771fcfac-9ef4-440d-a70f-54808c02081f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.631296 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-config" (OuterVolumeSpecName: "config") pod "771fcfac-9ef4-440d-a70f-54808c02081f" (UID: "771fcfac-9ef4-440d-a70f-54808c02081f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.631536 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-config\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.631568 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.631823 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632364 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-config\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632396 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632602 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632801 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftgnd\" (UniqueName: \"kubernetes.io/projected/78a46162-d658-4f7b-8643-1ffca846a558-kube-api-access-ftgnd\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632957 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632982 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.632994 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/771fcfac-9ef4-440d-a70f-54808c02081f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.633955 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.642081 4914 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/771fcfac-9ef4-440d-a70f-54808c02081f-kube-api-access-vcpg8" (OuterVolumeSpecName: "kube-api-access-vcpg8") pod "771fcfac-9ef4-440d-a70f-54808c02081f" (UID: "771fcfac-9ef4-440d-a70f-54808c02081f"). InnerVolumeSpecName "kube-api-access-vcpg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.647941 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftgnd\" (UniqueName: \"kubernetes.io/projected/78a46162-d658-4f7b-8643-1ffca846a558-kube-api-access-ftgnd\") pod \"dnsmasq-dns-86db49b7ff-q69cm\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.734201 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:21 crc kubenswrapper[4914]: I0130 21:32:21.734905 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcpg8\" (UniqueName: \"kubernetes.io/projected/771fcfac-9ef4-440d-a70f-54808c02081f-kube-api-access-vcpg8\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:22 crc kubenswrapper[4914]: I0130 21:32:22.482579 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-59mcr" Jan 30 21:32:22 crc kubenswrapper[4914]: I0130 21:32:22.554661 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-59mcr"] Jan 30 21:32:22 crc kubenswrapper[4914]: I0130 21:32:22.558347 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-59mcr"] Jan 30 21:32:23 crc kubenswrapper[4914]: I0130 21:32:23.833259 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="771fcfac-9ef4-440d-a70f-54808c02081f" path="/var/lib/kubelet/pods/771fcfac-9ef4-440d-a70f-54808c02081f/volumes" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.019899 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.020100 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n664h54ch74h547h55fh694hc6h579h5ch68h5d5h684h59fh57fhb4h5bfh5ffh656h55fh55fhd4h66dhdfh676h9bh5f7h58h56h684h557h98h675q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvcc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-kv2g9_openstack(11cefee1-f5e9-4f79-b25b-8dae49655475): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.021350 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-kv2g9" podUID="11cefee1-f5e9-4f79-b25b-8dae49655475" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.021725 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.021932 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9m75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(63625a35-5028-4dda-b9b3-ec3910fd8385): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.023821 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="63625a35-5028-4dda-b9b3-ec3910fd8385" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.081521 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.081734 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/prometheus/config/prometheus.yaml.gz --config-envsubst-file=/etc/prometheus/config_out/prometheus.env.yaml --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0 --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1 
--watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-init,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/prometheus/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mn2n5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],
},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.082926 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.149933 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.150076 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vc28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(da3bc7da-e810-4d0a-a7df-792c544f3a23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.151145 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="da3bc7da-e810-4d0a-a7df-792c544f3a23" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.506289 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-kv2g9" podUID="11cefee1-f5e9-4f79-b25b-8dae49655475" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.506504 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="da3bc7da-e810-4d0a-a7df-792c544f3a23" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.506569 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.506579 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="63625a35-5028-4dda-b9b3-ec3910fd8385" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.849693 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:74d61619b9420655da84bc9939e37f76040b437a70e9c96eeb3267f00dfe88ad" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.850173 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:74d61619b9420655da84bc9939e37f76040b437a70e9c96eeb3267f00dfe88ad,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key 
--tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/r
un/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dbnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj_openstack(2954a978-cc4d-4e5a-95af-d3bab9a9b3d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.851412 4914 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" podUID="2954a978-cc4d-4e5a-95af-d3bab9a9b3d1" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.955106 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2b491fcb180423632d30811515a439a7a7f41023c1cfe4780647f18969b85a1d" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.956012 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2b491fcb180423632d30811515a439a7a7f41023c1cfe4780647f18969b85a1d,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zl8x7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liven
essProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-795fd8f8cc-vq9hr_openstack(e528e0c0-c547-4d1d-8624-f8b2c8d450cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:32:25 crc kubenswrapper[4914]: E0130 21:32:25.957308 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" podUID="e528e0c0-c547-4d1d-8624-f8b2c8d450cf" Jan 30 21:32:26 crc kubenswrapper[4914]: E0130 21:32:26.517644 4914 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:74d61619b9420655da84bc9939e37f76040b437a70e9c96eeb3267f00dfe88ad\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" podUID="2954a978-cc4d-4e5a-95af-d3bab9a9b3d1" Jan 30 21:32:26 crc kubenswrapper[4914]: I0130 21:32:26.622491 4914 scope.go:117] "RemoveContainer" containerID="2cfbdc0f7e98bbbad96f0c6672a4c30d60ec519b33ec2740fc0be4311c5bd174" Jan 30 21:32:27 crc kubenswrapper[4914]: I0130 21:32:27.168768 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-q69cm"] Jan 30 21:32:27 crc kubenswrapper[4914]: I0130 21:32:27.263938 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-68th2"] Jan 30 21:32:27 crc kubenswrapper[4914]: E0130 21:32:27.521579 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 30 21:32:27 crc kubenswrapper[4914]: E0130 21:32:27.521840 4914 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 30 21:32:27 crc kubenswrapper[4914]: E0130 21:32:27.521962 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knszm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(134b35c4-3656-4890-8cb2-76bc09779403): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Jan 30 21:32:27 crc kubenswrapper[4914]: E0130 21:32:27.523431 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="134b35c4-3656-4890-8cb2-76bc09779403" Jan 30 21:32:27 crc kubenswrapper[4914]: I0130 21:32:27.525430 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rdzm9" event={"ID":"3f063a16-987d-4378-b889-966755034c3e","Type":"ContainerStarted","Data":"ec3927b68389ce8840b039d5728fafaacb72d60e2f6f6c39ac563dff37a8d98a"} Jan 30 21:32:27 crc kubenswrapper[4914]: I0130 21:32:27.525679 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rdzm9" Jan 30 21:32:27 crc kubenswrapper[4914]: I0130 21:32:27.550675 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rdzm9" podStartSLOduration=11.668237406 podStartE2EDuration="30.550653436s" podCreationTimestamp="2026-01-30 21:31:57 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.646958995 +0000 UTC m=+1061.085595746" lastFinishedPulling="2026-01-30 21:32:26.529375015 +0000 UTC m=+1079.968011776" observedRunningTime="2026-01-30 21:32:27.549778565 +0000 UTC m=+1080.988415326" watchObservedRunningTime="2026-01-30 21:32:27.550653436 +0000 UTC m=+1080.989290197" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.537058 4914 generic.go:334] "Generic (PLEG): container finished" podID="78a46162-d658-4f7b-8643-1ffca846a558" containerID="e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349" exitCode=0 Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.537256 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" 
event={"ID":"78a46162-d658-4f7b-8643-1ffca846a558","Type":"ContainerDied","Data":"e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.537569 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" event={"ID":"78a46162-d658-4f7b-8643-1ffca846a558","Type":"ContainerStarted","Data":"cbccbc5d9e1345b9cb2c9a2a34cd5ce531902f5ff576a13370e95bd61f68df14"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.540115 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-68th2" event={"ID":"c819be77-2b86-4bbf-9e4b-f9738f59032d","Type":"ContainerStarted","Data":"d9bb68278b600162c5e62643c21114b4657030d70bd425679e93e3c23361e6f0"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.542904 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"555d8330-2863-4fe8-96b8-2a751de6569d","Type":"ContainerStarted","Data":"082003ca5a86d97439ba554d9844042fef0a6edf406d7d8f1179c40d9ebfe2ae"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.544833 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" event={"ID":"915fbbd9-20c9-4552-bf18-a61af008b1d8","Type":"ContainerStarted","Data":"a4cfa83b5aa769c1a6de472d80a75728de3e563fca4ff856d1ecf8cdae8b79d3"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.545176 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.567131 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0","Type":"ContainerStarted","Data":"a66617fe04c2995bdf069364e2fef0626abcf55787d3a14ce86ac69506d50116"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.567834 4914 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.578784 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" event={"ID":"844bac7f-9f50-49c2-a05c-963b99ca4490","Type":"ContainerStarted","Data":"40a5a9633ccd5c778ee72f326cbabdc033f6595cdd4e43280181f058b9517ad1"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.580075 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.581942 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5","Type":"ContainerStarted","Data":"18d22e5e6966074eb35cb9eac037e5b298f9b653ac01e01e1a332474b88a4ba5"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.582043 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.587872 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abe9f42c-7055-4099-ad8e-f827973007cd","Type":"ContainerStarted","Data":"0b8847dbb59b3a4b109074fccd9442b4f8abae51099509ab1e08a9956d6ced62"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.597479 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.606391 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" podStartSLOduration=7.386037503 podStartE2EDuration="26.60636557s" podCreationTimestamp="2026-01-30 21:32:02 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.64927492 +0000 UTC 
m=+1061.087911671" lastFinishedPulling="2026-01-30 21:32:26.869602977 +0000 UTC m=+1080.308239738" observedRunningTime="2026-01-30 21:32:28.584766043 +0000 UTC m=+1082.023402804" watchObservedRunningTime="2026-01-30 21:32:28.60636557 +0000 UTC m=+1082.045002331" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.606880 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"1cd64ca8-c110-4af1-ad2e-edbed561a3b3","Type":"ContainerStarted","Data":"93de60d06de37143dd330a98338266e2916b964fe9b37510f29ee0b89f52673d"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.607203 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.613165 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" event={"ID":"e528e0c0-c547-4d1d-8624-f8b2c8d450cf","Type":"ContainerStarted","Data":"e7d597a0223566ca9beeb8fbd19adbaac3690e9c652421b47fa4e70d0169c5cb"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.613436 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.615590 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" event={"ID":"c2060bc5-fb2c-4421-b6a0-7acbd5549c8d","Type":"ContainerStarted","Data":"810574558be444c6331aeda116dced0a3800df760efdbc144211e1e30dd6a94d"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.616392 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.618879 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" 
event={"ID":"79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0","Type":"ContainerStarted","Data":"fb14b8d19e535890f140034cd0934b774c45d18f035f753cd8ef7545abd3fe8c"} Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.619477 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.629311 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv" podStartSLOduration=6.56247448 podStartE2EDuration="25.629290898s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.644384393 +0000 UTC m=+1061.083021154" lastFinishedPulling="2026-01-30 21:32:26.711200811 +0000 UTC m=+1080.149837572" observedRunningTime="2026-01-30 21:32:28.603197914 +0000 UTC m=+1082.041834695" watchObservedRunningTime="2026-01-30 21:32:28.629290898 +0000 UTC m=+1082.067927659" Jan 30 21:32:28 crc kubenswrapper[4914]: E0130 21:32:28.635796 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="134b35c4-3656-4890-8cb2-76bc09779403" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.649593 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.902624368 podStartE2EDuration="38.649567492s" podCreationTimestamp="2026-01-30 21:31:50 +0000 UTC" firstStartedPulling="2026-01-30 21:32:06.54561303 +0000 UTC m=+1059.984249791" lastFinishedPulling="2026-01-30 21:32:26.292556144 +0000 UTC m=+1079.731192915" observedRunningTime="2026-01-30 21:32:28.632761251 +0000 UTC m=+1082.071398012" watchObservedRunningTime="2026-01-30 21:32:28.649567492 +0000 UTC m=+1082.088204253" Jan 30 21:32:28 crc 
kubenswrapper[4914]: I0130 21:32:28.652545 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=6.5980222 podStartE2EDuration="25.652526643s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.253327706 +0000 UTC m=+1060.691964467" lastFinishedPulling="2026-01-30 21:32:26.307832139 +0000 UTC m=+1079.746468910" observedRunningTime="2026-01-30 21:32:28.652278857 +0000 UTC m=+1082.090915638" watchObservedRunningTime="2026-01-30 21:32:28.652526643 +0000 UTC m=+1082.091163404" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.687498 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=6.188326287 podStartE2EDuration="25.687475828s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.142510497 +0000 UTC m=+1060.581147258" lastFinishedPulling="2026-01-30 21:32:26.641660038 +0000 UTC m=+1080.080296799" observedRunningTime="2026-01-30 21:32:28.681580848 +0000 UTC m=+1082.120217609" watchObservedRunningTime="2026-01-30 21:32:28.687475828 +0000 UTC m=+1082.126112589" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.707769 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" podStartSLOduration=6.055255916 podStartE2EDuration="25.707754873s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.217267114 +0000 UTC m=+1060.655903875" lastFinishedPulling="2026-01-30 21:32:26.869766071 +0000 UTC m=+1080.308402832" observedRunningTime="2026-01-30 21:32:28.701188496 +0000 UTC m=+1082.139825257" watchObservedRunningTime="2026-01-30 21:32:28.707754873 +0000 UTC m=+1082.146391634" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.725586 4914 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=6.958723899 podStartE2EDuration="26.725568129s" podCreationTimestamp="2026-01-30 21:32:02 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.262036584 +0000 UTC m=+1060.700673345" lastFinishedPulling="2026-01-30 21:32:27.028880824 +0000 UTC m=+1080.467517575" observedRunningTime="2026-01-30 21:32:28.717024925 +0000 UTC m=+1082.155661686" watchObservedRunningTime="2026-01-30 21:32:28.725568129 +0000 UTC m=+1082.164204900" Jan 30 21:32:28 crc kubenswrapper[4914]: I0130 21:32:28.781214 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" podStartSLOduration=-9223372011.073578 podStartE2EDuration="25.781198869s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.637403066 +0000 UTC m=+1061.076039827" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:32:28.762331608 +0000 UTC m=+1082.200968369" watchObservedRunningTime="2026-01-30 21:32:28.781198869 +0000 UTC m=+1082.219835620" Jan 30 21:32:29 crc kubenswrapper[4914]: I0130 21:32:29.640766 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" event={"ID":"78a46162-d658-4f7b-8643-1ffca846a558","Type":"ContainerStarted","Data":"ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73"} Jan 30 21:32:29 crc kubenswrapper[4914]: I0130 21:32:29.643325 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 21:32:30.653866 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"555d8330-2863-4fe8-96b8-2a751de6569d","Type":"ContainerStarted","Data":"4eec434f039b9bc21fe81c21444b686ca405c91e5f9782f323d156afb357c3be"} Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 
21:32:30.656837 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-68th2" event={"ID":"c819be77-2b86-4bbf-9e4b-f9738f59032d","Type":"ContainerStarted","Data":"ce3925f14b16562ea515efbe15cb23c137a5db2dfc596172385e24c53eadc30c"} Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 21:32:30.666063 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abe9f42c-7055-4099-ad8e-f827973007cd","Type":"ContainerStarted","Data":"e7788d2e869fb31180afd5fbce2c0f46a8ff49a4ca9bf0669209e4f937efd14d"} Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 21:32:30.685088 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" podStartSLOduration=9.685054134 podStartE2EDuration="9.685054134s" podCreationTimestamp="2026-01-30 21:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:32:29.670946945 +0000 UTC m=+1083.109583696" watchObservedRunningTime="2026-01-30 21:32:30.685054134 +0000 UTC m=+1084.123690925" Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 21:32:30.693494 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.319691276 podStartE2EDuration="31.693468126s" podCreationTimestamp="2026-01-30 21:31:59 +0000 UTC" firstStartedPulling="2026-01-30 21:32:12.507634725 +0000 UTC m=+1065.946271476" lastFinishedPulling="2026-01-30 21:32:29.881411565 +0000 UTC m=+1083.320048326" observedRunningTime="2026-01-30 21:32:30.679100592 +0000 UTC m=+1084.117737423" watchObservedRunningTime="2026-01-30 21:32:30.693468126 +0000 UTC m=+1084.132104917" Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 21:32:30.712891 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.305859934 podStartE2EDuration="34.712867319s" 
podCreationTimestamp="2026-01-30 21:31:56 +0000 UTC" firstStartedPulling="2026-01-30 21:32:12.507939672 +0000 UTC m=+1065.946576433" lastFinishedPulling="2026-01-30 21:32:29.914947057 +0000 UTC m=+1083.353583818" observedRunningTime="2026-01-30 21:32:30.709265233 +0000 UTC m=+1084.147902034" watchObservedRunningTime="2026-01-30 21:32:30.712867319 +0000 UTC m=+1084.151504120" Jan 30 21:32:30 crc kubenswrapper[4914]: I0130 21:32:30.737284 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-68th2" podStartSLOduration=7.4386321 podStartE2EDuration="9.737264642s" podCreationTimestamp="2026-01-30 21:32:21 +0000 UTC" firstStartedPulling="2026-01-30 21:32:27.573761018 +0000 UTC m=+1081.012397779" lastFinishedPulling="2026-01-30 21:32:29.87239356 +0000 UTC m=+1083.311030321" observedRunningTime="2026-01-30 21:32:30.732456298 +0000 UTC m=+1084.171093119" watchObservedRunningTime="2026-01-30 21:32:30.737264642 +0000 UTC m=+1084.175901413" Jan 30 21:32:31 crc kubenswrapper[4914]: I0130 21:32:31.280310 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:31 crc kubenswrapper[4914]: I0130 21:32:31.280650 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:31 crc kubenswrapper[4914]: I0130 21:32:31.326590 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:32 crc kubenswrapper[4914]: I0130 21:32:32.403777 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 30 21:32:33 crc kubenswrapper[4914]: I0130 21:32:33.404837 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 30 21:32:33 crc kubenswrapper[4914]: I0130 21:32:33.458552 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Jan 30 21:32:33 crc kubenswrapper[4914]: I0130 21:32:33.734390 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 30 21:32:35 crc kubenswrapper[4914]: I0130 21:32:35.704349 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.325283 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.578368 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.580346 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.586810 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.619292 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.619636 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.619883 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rl7ff" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.623080 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.642553 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.642604 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.642628 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.642649 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnbb\" (UniqueName: \"kubernetes.io/projected/91e45099-57bd-49e7-aa99-5e11b711ec92-kube-api-access-fdnbb\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.642692 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e45099-57bd-49e7-aa99-5e11b711ec92-config\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.642725 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e45099-57bd-49e7-aa99-5e11b711ec92-scripts\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: 
I0130 21:32:36.642748 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91e45099-57bd-49e7-aa99-5e11b711ec92-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.735051 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"46107121-a72c-40a7-904c-24c6c33de7c4","Type":"ContainerStarted","Data":"97cb01ef00268ba18335e92f9773acb46197de33a7140a365dfda8c711296eb9"} Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.735531 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.737567 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c506e0ae-e4b2-4cd7-87ea-bc10619f874e","Type":"ContainerStarted","Data":"004c6c908d0d5695cba3e148480eb2debb05fb64209a0e7961d73f9232c504b0"} Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.744527 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.744878 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.745006 4914 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.745128 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnbb\" (UniqueName: \"kubernetes.io/projected/91e45099-57bd-49e7-aa99-5e11b711ec92-kube-api-access-fdnbb\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.745293 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e45099-57bd-49e7-aa99-5e11b711ec92-config\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.745409 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e45099-57bd-49e7-aa99-5e11b711ec92-scripts\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.745534 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91e45099-57bd-49e7-aa99-5e11b711ec92-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.747153 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91e45099-57bd-49e7-aa99-5e11b711ec92-config\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc 
kubenswrapper[4914]: I0130 21:32:36.747882 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91e45099-57bd-49e7-aa99-5e11b711ec92-scripts\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.748671 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91e45099-57bd-49e7-aa99-5e11b711ec92-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.749126 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.751946 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.754796 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91e45099-57bd-49e7-aa99-5e11b711ec92-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.766493 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnbb\" (UniqueName: \"kubernetes.io/projected/91e45099-57bd-49e7-aa99-5e11b711ec92-kube-api-access-fdnbb\") pod 
\"ovn-northd-0\" (UID: \"91e45099-57bd-49e7-aa99-5e11b711ec92\") " pod="openstack/ovn-northd-0" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.802834 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rnjqw"] Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.803090 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerName="dnsmasq-dns" containerID="cri-o://7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c" gracePeriod=10 Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.821871 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.106:5353: connect: connection refused" Jan 30 21:32:36 crc kubenswrapper[4914]: I0130 21:32:36.937073 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.338257 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.461009 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-config\") pod \"a6edeafb-6617-4058-9b35-bf0bb078ceba\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.461152 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-658x5\" (UniqueName: \"kubernetes.io/projected/a6edeafb-6617-4058-9b35-bf0bb078ceba-kube-api-access-658x5\") pod \"a6edeafb-6617-4058-9b35-bf0bb078ceba\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.461230 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-dns-svc\") pod \"a6edeafb-6617-4058-9b35-bf0bb078ceba\" (UID: \"a6edeafb-6617-4058-9b35-bf0bb078ceba\") " Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.471783 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6edeafb-6617-4058-9b35-bf0bb078ceba-kube-api-access-658x5" (OuterVolumeSpecName: "kube-api-access-658x5") pod "a6edeafb-6617-4058-9b35-bf0bb078ceba" (UID: "a6edeafb-6617-4058-9b35-bf0bb078ceba"). InnerVolumeSpecName "kube-api-access-658x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.474063 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 21:32:37 crc kubenswrapper[4914]: W0130 21:32:37.476812 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91e45099_57bd_49e7_aa99_5e11b711ec92.slice/crio-aa58725fc3a998e7c37967580fe851e555071c02d782490581c98b55fe7db2b8 WatchSource:0}: Error finding container aa58725fc3a998e7c37967580fe851e555071c02d782490581c98b55fe7db2b8: Status 404 returned error can't find the container with id aa58725fc3a998e7c37967580fe851e555071c02d782490581c98b55fe7db2b8 Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.563799 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-658x5\" (UniqueName: \"kubernetes.io/projected/a6edeafb-6617-4058-9b35-bf0bb078ceba-kube-api-access-658x5\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.590413 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6edeafb-6617-4058-9b35-bf0bb078ceba" (UID: "a6edeafb-6617-4058-9b35-bf0bb078ceba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.618604 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-config" (OuterVolumeSpecName: "config") pod "a6edeafb-6617-4058-9b35-bf0bb078ceba" (UID: "a6edeafb-6617-4058-9b35-bf0bb078ceba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.665603 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.665639 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6edeafb-6617-4058-9b35-bf0bb078ceba-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.780008 4914 generic.go:334] "Generic (PLEG): container finished" podID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerID="7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c" exitCode=0 Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.780063 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.780085 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" event={"ID":"a6edeafb-6617-4058-9b35-bf0bb078ceba","Type":"ContainerDied","Data":"7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c"} Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.780573 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rnjqw" event={"ID":"a6edeafb-6617-4058-9b35-bf0bb078ceba","Type":"ContainerDied","Data":"73037bf5311c6edc31d17452cc77921416acdeb8183b92775f460648d8b74a3e"} Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.780612 4914 scope.go:117] "RemoveContainer" containerID="7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.786046 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"f394410a-5ff7-4a0c-84ec-4b60c63c707c","Type":"ContainerStarted","Data":"f8343c308380c5164c5ade6d747612b6694879f8d31de8cbd0fbfc60d77d1c07"} Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.788308 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"91e45099-57bd-49e7-aa99-5e11b711ec92","Type":"ContainerStarted","Data":"aa58725fc3a998e7c37967580fe851e555071c02d782490581c98b55fe7db2b8"} Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.807521 4914 scope.go:117] "RemoveContainer" containerID="28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.844849 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rnjqw"] Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.845893 4914 scope.go:117] "RemoveContainer" containerID="7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c" Jan 30 21:32:37 crc kubenswrapper[4914]: E0130 21:32:37.846410 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c\": container with ID starting with 7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c not found: ID does not exist" containerID="7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.846453 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c"} err="failed to get container status \"7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c\": rpc error: code = NotFound desc = could not find container \"7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c\": container with ID starting with 
7b38ff50c0e3fb47d27d5a3de46fcdbdf0923826b3487adc61fa3506e24def1c not found: ID does not exist" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.846497 4914 scope.go:117] "RemoveContainer" containerID="28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135" Jan 30 21:32:37 crc kubenswrapper[4914]: E0130 21:32:37.847055 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135\": container with ID starting with 28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135 not found: ID does not exist" containerID="28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.847091 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135"} err="failed to get container status \"28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135\": rpc error: code = NotFound desc = could not find container \"28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135\": container with ID starting with 28b4e1bf009dd88101a6f8c6c40d3be9aceb260ab718b14aabf6b407e2c7c135 not found: ID does not exist" Jan 30 21:32:37 crc kubenswrapper[4914]: I0130 21:32:37.853738 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rnjqw"] Jan 30 21:32:39 crc kubenswrapper[4914]: I0130 21:32:39.812366 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"91e45099-57bd-49e7-aa99-5e11b711ec92","Type":"ContainerStarted","Data":"dde284ac96f0016a8782f245bbc4fe633a30b3dd906cce92db94de6faead2e71"} Jan 30 21:32:39 crc kubenswrapper[4914]: I0130 21:32:39.812871 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 30 21:32:39 crc kubenswrapper[4914]: 
I0130 21:32:39.812893 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"91e45099-57bd-49e7-aa99-5e11b711ec92","Type":"ContainerStarted","Data":"5daa49367ddf9f2b38b2e2a569efed997914321c06a6dd910e5ce06ab582209c"} Jan 30 21:32:39 crc kubenswrapper[4914]: I0130 21:32:39.816253 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63625a35-5028-4dda-b9b3-ec3910fd8385","Type":"ContainerStarted","Data":"70ae094798887e399e9d1376ebead63589b67a00897f0311ff7cd8b6886df89e"} Jan 30 21:32:39 crc kubenswrapper[4914]: I0130 21:32:39.869390 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.533334815 podStartE2EDuration="3.869368979s" podCreationTimestamp="2026-01-30 21:32:36 +0000 UTC" firstStartedPulling="2026-01-30 21:32:37.478762528 +0000 UTC m=+1090.917399289" lastFinishedPulling="2026-01-30 21:32:38.814796692 +0000 UTC m=+1092.253433453" observedRunningTime="2026-01-30 21:32:39.843746747 +0000 UTC m=+1093.282383548" watchObservedRunningTime="2026-01-30 21:32:39.869368979 +0000 UTC m=+1093.308005750" Jan 30 21:32:39 crc kubenswrapper[4914]: I0130 21:32:39.875294 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" path="/var/lib/kubelet/pods/a6edeafb-6617-4058-9b35-bf0bb078ceba/volumes" Jan 30 21:32:40 crc kubenswrapper[4914]: I0130 21:32:40.826853 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"da3bc7da-e810-4d0a-a7df-792c544f3a23","Type":"ContainerStarted","Data":"7365bafaa4e248fb75478bef8bf8b53175508fa5c7cdcf5bfda3b820eee56e1f"} Jan 30 21:32:40 crc kubenswrapper[4914]: I0130 21:32:40.828439 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" 
event={"ID":"2954a978-cc4d-4e5a-95af-d3bab9a9b3d1","Type":"ContainerStarted","Data":"129ff2bafaba15e122879bed66da7f04dd9ba9c52ecd56db5721ce942168f0b1"} Jan 30 21:32:40 crc kubenswrapper[4914]: I0130 21:32:40.828737 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:40 crc kubenswrapper[4914]: I0130 21:32:40.829622 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kv2g9" event={"ID":"11cefee1-f5e9-4f79-b25b-8dae49655475","Type":"ContainerStarted","Data":"9fee150b04b8a556e2e486cf4f01cd96cef4eb3a3e041d2bceee348f10c5d82b"} Jan 30 21:32:40 crc kubenswrapper[4914]: I0130 21:32:40.844837 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" Jan 30 21:32:40 crc kubenswrapper[4914]: I0130 21:32:40.909148 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj" podStartSLOduration=-9223371998.945648 podStartE2EDuration="37.909128371s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.078785924 +0000 UTC m=+1060.517422685" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:32:40.871199734 +0000 UTC m=+1094.309836495" watchObservedRunningTime="2026-01-30 21:32:40.909128371 +0000 UTC m=+1094.347765132" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.740117 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-5g9tf"] Jan 30 21:32:42 crc kubenswrapper[4914]: E0130 21:32:42.741650 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerName="init" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.741743 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" 
containerName="init" Jan 30 21:32:42 crc kubenswrapper[4914]: E0130 21:32:42.741837 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerName="dnsmasq-dns" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.741890 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerName="dnsmasq-dns" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.742105 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6edeafb-6617-4058-9b35-bf0bb078ceba" containerName="dnsmasq-dns" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.743051 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.759541 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5g9tf"] Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.845047 4914 generic.go:334] "Generic (PLEG): container finished" podID="46107121-a72c-40a7-904c-24c6c33de7c4" containerID="97cb01ef00268ba18335e92f9773acb46197de33a7140a365dfda8c711296eb9" exitCode=0 Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.845287 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"46107121-a72c-40a7-904c-24c6c33de7c4","Type":"ContainerDied","Data":"97cb01ef00268ba18335e92f9773acb46197de33a7140a365dfda8c711296eb9"} Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.868763 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.868830 
4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-config\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.868897 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-dns-svc\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.868929 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlfcb\" (UniqueName: \"kubernetes.io/projected/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-kube-api-access-vlfcb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.869020 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.971024 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-dns-svc\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.971061 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlfcb\" (UniqueName: \"kubernetes.io/projected/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-kube-api-access-vlfcb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.971214 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.971238 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.971294 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-config\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.972286 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.972502 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.972634 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-config\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:42 crc kubenswrapper[4914]: I0130 21:32:42.973317 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-dns-svc\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.076287 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlfcb\" (UniqueName: \"kubernetes.io/projected/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-kube-api-access-vlfcb\") pod \"dnsmasq-dns-698758b865-5g9tf\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.210435 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-5wh44" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.358253 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.419326 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-vq9hr" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.490041 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.729131 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5g9tf"] Jan 30 21:32:43 crc kubenswrapper[4914]: W0130 21:32:43.735534 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a72c047_4bec_41a0_bcd6_6002e9fb8dbe.slice/crio-956e047291c1480d721f1809a6e625fa125c395fe2436f8b76d532410a234118 WatchSource:0}: Error finding container 956e047291c1480d721f1809a6e625fa125c395fe2436f8b76d532410a234118: Status 404 returned error can't find the container with id 956e047291c1480d721f1809a6e625fa125c395fe2436f8b76d532410a234118 Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.790432 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.801143 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.823030 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.823576 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.823751 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.834500 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-75c8n" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.875612 4914 generic.go:334] "Generic (PLEG): container finished" podID="11cefee1-f5e9-4f79-b25b-8dae49655475" containerID="9fee150b04b8a556e2e486cf4f01cd96cef4eb3a3e041d2bceee348f10c5d82b" exitCode=0 Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.875687 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kv2g9" event={"ID":"11cefee1-f5e9-4f79-b25b-8dae49655475","Type":"ContainerDied","Data":"9fee150b04b8a556e2e486cf4f01cd96cef4eb3a3e041d2bceee348f10c5d82b"} Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.877811 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5g9tf" event={"ID":"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe","Type":"ContainerStarted","Data":"956e047291c1480d721f1809a6e625fa125c395fe2436f8b76d532410a234118"} Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.880418 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.888788 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtc7z\" (UniqueName: 
\"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-kube-api-access-wtc7z\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.888844 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.888896 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3a754950-b587-4c0a-85ed-e9669582ea2c-lock\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.888923 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a754950-b587-4c0a-85ed-e9669582ea2c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.888955 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.889000 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3a754950-b587-4c0a-85ed-e9669582ea2c-cache\") 
pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.898428 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"134b35c4-3656-4890-8cb2-76bc09779403","Type":"ContainerStarted","Data":"47c9c3cf033214c46eb2573058e39b836cf81ac9dfe8ebc208f5186a76e60b4c"} Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.899116 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.932951 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.995824493 podStartE2EDuration="51.932908636s" podCreationTimestamp="2026-01-30 21:31:52 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.252840604 +0000 UTC m=+1060.691477355" lastFinishedPulling="2026-01-30 21:32:43.189924737 +0000 UTC m=+1096.628561498" observedRunningTime="2026-01-30 21:32:43.916623256 +0000 UTC m=+1097.355260017" watchObservedRunningTime="2026-01-30 21:32:43.932908636 +0000 UTC m=+1097.371545397" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.990736 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtc7z\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-kube-api-access-wtc7z\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.990783 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 
crc kubenswrapper[4914]: I0130 21:32:43.990830 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3a754950-b587-4c0a-85ed-e9669582ea2c-lock\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.990854 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a754950-b587-4c0a-85ed-e9669582ea2c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.990894 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.990933 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3a754950-b587-4c0a-85ed-e9669582ea2c-cache\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: E0130 21:32:43.991399 4914 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:32:43 crc kubenswrapper[4914]: E0130 21:32:43.991418 4914 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:32:43 crc kubenswrapper[4914]: E0130 21:32:43.991459 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift 
podName:3a754950-b587-4c0a-85ed-e9669582ea2c nodeName:}" failed. No retries permitted until 2026-01-30 21:32:44.491444095 +0000 UTC m=+1097.930080846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift") pod "swift-storage-0" (UID: "3a754950-b587-4c0a-85ed-e9669582ea2c") : configmap "swift-ring-files" not found Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.991664 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3a754950-b587-4c0a-85ed-e9669582ea2c-cache\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.992830 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3a754950-b587-4c0a-85ed-e9669582ea2c-lock\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.998186 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.998368 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/98ce50ad4f214042a4a638cdb396eb7c9cc5002d3916a8137932ecfd93467a94/globalmount\"" pod="openstack/swift-storage-0" Jan 30 21:32:43 crc kubenswrapper[4914]: I0130 21:32:43.998314 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a754950-b587-4c0a-85ed-e9669582ea2c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.034859 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtc7z\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-kube-api-access-wtc7z\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.040298 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9bb27911-3e75-4b0b-9702-70ac91e2c993\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.354111 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.499876 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:44 crc kubenswrapper[4914]: E0130 21:32:44.500134 4914 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:32:44 crc kubenswrapper[4914]: E0130 21:32:44.500168 4914 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:32:44 crc kubenswrapper[4914]: E0130 21:32:44.500242 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift podName:3a754950-b587-4c0a-85ed-e9669582ea2c nodeName:}" failed. No retries permitted until 2026-01-30 21:32:45.500219256 +0000 UTC m=+1098.938856017 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift") pod "swift-storage-0" (UID: "3a754950-b587-4c0a-85ed-e9669582ea2c") : configmap "swift-ring-files" not found Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.502989 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.540147 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.912620 4914 generic.go:334] "Generic (PLEG): container finished" podID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerID="dfe8b844069b1515bd0bf43bcc4923434b172851daaaceab570e16acbaea9efd" exitCode=0 Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.912678 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5g9tf" event={"ID":"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe","Type":"ContainerDied","Data":"dfe8b844069b1515bd0bf43bcc4923434b172851daaaceab570e16acbaea9efd"} Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.919085 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kv2g9" event={"ID":"11cefee1-f5e9-4f79-b25b-8dae49655475","Type":"ContainerStarted","Data":"4bb53e77fc32c87b8121d71f4a0c57724f1e43dcd247c2541784a8e419d707d6"} Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.919173 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kv2g9" event={"ID":"11cefee1-f5e9-4f79-b25b-8dae49655475","Type":"ContainerStarted","Data":"b74a77bfa18c514847b6bf7ba1f3cc498b01eb39469e0d540e278893c4c221d2"} Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.919471 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 
21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.919529 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.920422 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerStarted","Data":"12aae14a6f79949923517112307201ed4c40a81d30ca3db9cb5e044d91f2e9ed"} Jan 30 21:32:44 crc kubenswrapper[4914]: I0130 21:32:44.956372 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kv2g9" podStartSLOduration=15.155682432 podStartE2EDuration="47.956359028s" podCreationTimestamp="2026-01-30 21:31:57 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.504189382 +0000 UTC m=+1060.942826143" lastFinishedPulling="2026-01-30 21:32:40.304865978 +0000 UTC m=+1093.743502739" observedRunningTime="2026-01-30 21:32:44.953782847 +0000 UTC m=+1098.392419628" watchObservedRunningTime="2026-01-30 21:32:44.956359028 +0000 UTC m=+1098.394995789" Jan 30 21:32:45 crc kubenswrapper[4914]: I0130 21:32:45.521140 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:45 crc kubenswrapper[4914]: E0130 21:32:45.521310 4914 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:32:45 crc kubenswrapper[4914]: E0130 21:32:45.521685 4914 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:32:45 crc kubenswrapper[4914]: E0130 21:32:45.521756 4914 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift podName:3a754950-b587-4c0a-85ed-e9669582ea2c nodeName:}" failed. No retries permitted until 2026-01-30 21:32:47.521742442 +0000 UTC m=+1100.960379203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift") pod "swift-storage-0" (UID: "3a754950-b587-4c0a-85ed-e9669582ea2c") : configmap "swift-ring-files" not found Jan 30 21:32:45 crc kubenswrapper[4914]: I0130 21:32:45.936017 4914 generic.go:334] "Generic (PLEG): container finished" podID="63625a35-5028-4dda-b9b3-ec3910fd8385" containerID="70ae094798887e399e9d1376ebead63589b67a00897f0311ff7cd8b6886df89e" exitCode=0 Jan 30 21:32:45 crc kubenswrapper[4914]: I0130 21:32:45.936087 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63625a35-5028-4dda-b9b3-ec3910fd8385","Type":"ContainerDied","Data":"70ae094798887e399e9d1376ebead63589b67a00897f0311ff7cd8b6886df89e"} Jan 30 21:32:45 crc kubenswrapper[4914]: I0130 21:32:45.940269 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5g9tf" event={"ID":"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe","Type":"ContainerStarted","Data":"58d2ae20f52406209286753e4fda81fb98788d6f3f90414533d61c2e6ecab008"} Jan 30 21:32:45 crc kubenswrapper[4914]: I0130 21:32:45.940321 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:45 crc kubenswrapper[4914]: I0130 21:32:45.981555 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-5g9tf" podStartSLOduration=3.981533673 podStartE2EDuration="3.981533673s" podCreationTimestamp="2026-01-30 21:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
21:32:45.979218477 +0000 UTC m=+1099.417855248" watchObservedRunningTime="2026-01-30 21:32:45.981533673 +0000 UTC m=+1099.420170444" Jan 30 21:32:46 crc kubenswrapper[4914]: I0130 21:32:46.950900 4914 generic.go:334] "Generic (PLEG): container finished" podID="da3bc7da-e810-4d0a-a7df-792c544f3a23" containerID="7365bafaa4e248fb75478bef8bf8b53175508fa5c7cdcf5bfda3b820eee56e1f" exitCode=0 Jan 30 21:32:46 crc kubenswrapper[4914]: I0130 21:32:46.950999 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"da3bc7da-e810-4d0a-a7df-792c544f3a23","Type":"ContainerDied","Data":"7365bafaa4e248fb75478bef8bf8b53175508fa5c7cdcf5bfda3b820eee56e1f"} Jan 30 21:32:46 crc kubenswrapper[4914]: I0130 21:32:46.954522 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"46107121-a72c-40a7-904c-24c6c33de7c4","Type":"ContainerStarted","Data":"634c02445ddc632f60ba0a85706e5c8d5362ab513278e136b7a9ea63c47507a3"} Jan 30 21:32:46 crc kubenswrapper[4914]: I0130 21:32:46.957894 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"63625a35-5028-4dda-b9b3-ec3910fd8385","Type":"ContainerStarted","Data":"e5b89913edb4a54eacaec45aff15b6fbcb156ced499b7c0b7be241e35e6cd010"} Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.012510 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.921354083 podStartE2EDuration="1m0.012483595s" podCreationTimestamp="2026-01-30 21:31:47 +0000 UTC" firstStartedPulling="2026-01-30 21:32:05.227745079 +0000 UTC m=+1058.666381880" lastFinishedPulling="2026-01-30 21:32:39.318874631 +0000 UTC m=+1092.757511392" observedRunningTime="2026-01-30 21:32:47.001250386 +0000 UTC m=+1100.439887187" watchObservedRunningTime="2026-01-30 21:32:47.012483595 +0000 UTC m=+1100.451120396" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.557207 
4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:47 crc kubenswrapper[4914]: E0130 21:32:47.557497 4914 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:32:47 crc kubenswrapper[4914]: E0130 21:32:47.558482 4914 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:32:47 crc kubenswrapper[4914]: E0130 21:32:47.558637 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift podName:3a754950-b587-4c0a-85ed-e9669582ea2c nodeName:}" failed. No retries permitted until 2026-01-30 21:32:51.558618028 +0000 UTC m=+1104.997254799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift") pod "swift-storage-0" (UID: "3a754950-b587-4c0a-85ed-e9669582ea2c") : configmap "swift-ring-files" not found Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.785069 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-4n6nd"] Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.786942 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.789089 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.789272 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.789424 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.793015 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4n6nd"] Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.965671 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-dispersionconf\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.965778 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsfvn\" (UniqueName: \"kubernetes.io/projected/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-kube-api-access-jsfvn\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.965847 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-scripts\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 
21:32:47.965878 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-swiftconf\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.965906 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-etc-swift\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.966069 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-ring-data-devices\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.966169 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-combined-ca-bundle\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:47 crc kubenswrapper[4914]: I0130 21:32:47.969357 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"da3bc7da-e810-4d0a-a7df-792c544f3a23","Type":"ContainerStarted","Data":"19b41dd1cb97894830cba6670bfa158726f252238a6207d5664394aa88040ce2"} Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.001976 4914 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371977.852821 podStartE2EDuration="59.001953644s" podCreationTimestamp="2026-01-30 21:31:49 +0000 UTC" firstStartedPulling="2026-01-30 21:32:06.538615513 +0000 UTC m=+1059.977252274" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:32:47.992205931 +0000 UTC m=+1101.430842712" watchObservedRunningTime="2026-01-30 21:32:48.001953644 +0000 UTC m=+1101.440590425" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.068282 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-dispersionconf\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.068425 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsfvn\" (UniqueName: \"kubernetes.io/projected/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-kube-api-access-jsfvn\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.068517 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-scripts\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.068555 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-swiftconf\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" 
Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.068602 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-etc-swift\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.069216 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-etc-swift\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.069472 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-scripts\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.069858 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-ring-data-devices\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.069934 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-combined-ca-bundle\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.070689 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-ring-data-devices\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.074458 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-swiftconf\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.085356 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-combined-ca-bundle\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.086343 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-dispersionconf\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.089047 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsfvn\" (UniqueName: \"kubernetes.io/projected/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-kube-api-access-jsfvn\") pod \"swift-ring-rebalance-4n6nd\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.106140 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.616009 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4n6nd"] Jan 30 21:32:48 crc kubenswrapper[4914]: W0130 21:32:48.622925 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c592d5_ad34_4357_b3a8_ecc0567e1e8d.slice/crio-5d9833d7e18d12f52af707bafeae69e30578e24be8545685ede1274142b848cb WatchSource:0}: Error finding container 5d9833d7e18d12f52af707bafeae69e30578e24be8545685ede1274142b848cb: Status 404 returned error can't find the container with id 5d9833d7e18d12f52af707bafeae69e30578e24be8545685ede1274142b848cb Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.978717 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"46107121-a72c-40a7-904c-24c6c33de7c4","Type":"ContainerStarted","Data":"c73d7a00a4a6359fc7804954e6351c36a505c6cef6f88f4bd3d9568161787ece"} Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.979040 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.979845 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4n6nd" event={"ID":"73c592d5-ad34-4357-b3a8-ecc0567e1e8d","Type":"ContainerStarted","Data":"5d9833d7e18d12f52af707bafeae69e30578e24be8545685ede1274142b848cb"} Jan 30 21:32:48 crc kubenswrapper[4914]: I0130 21:32:48.982523 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Jan 30 21:32:49 crc kubenswrapper[4914]: I0130 21:32:49.005410 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=16.780206714 
podStartE2EDuration="56.00539638s" podCreationTimestamp="2026-01-30 21:31:53 +0000 UTC" firstStartedPulling="2026-01-30 21:32:06.534078394 +0000 UTC m=+1059.972715155" lastFinishedPulling="2026-01-30 21:32:45.75926806 +0000 UTC m=+1099.197904821" observedRunningTime="2026-01-30 21:32:49.002151721 +0000 UTC m=+1102.440788482" watchObservedRunningTime="2026-01-30 21:32:49.00539638 +0000 UTC m=+1102.444033141" Jan 30 21:32:49 crc kubenswrapper[4914]: I0130 21:32:49.175021 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 30 21:32:49 crc kubenswrapper[4914]: I0130 21:32:49.175425 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 30 21:32:49 crc kubenswrapper[4914]: I0130 21:32:49.990765 4914 generic.go:334] "Generic (PLEG): container finished" podID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerID="12aae14a6f79949923517112307201ed4c40a81d30ca3db9cb5e044d91f2e9ed" exitCode=0 Jan 30 21:32:49 crc kubenswrapper[4914]: I0130 21:32:49.993838 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerDied","Data":"12aae14a6f79949923517112307201ed4c40a81d30ca3db9cb5e044d91f2e9ed"} Jan 30 21:32:50 crc kubenswrapper[4914]: I0130 21:32:50.507057 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 30 21:32:50 crc kubenswrapper[4914]: I0130 21:32:50.507422 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 30 21:32:51 crc kubenswrapper[4914]: I0130 21:32:51.262489 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 30 21:32:51 crc kubenswrapper[4914]: I0130 21:32:51.371890 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Jan 30 21:32:51 crc kubenswrapper[4914]: I0130 21:32:51.647628 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:51 crc kubenswrapper[4914]: E0130 21:32:51.647852 4914 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:32:51 crc kubenswrapper[4914]: E0130 21:32:51.648019 4914 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:32:51 crc kubenswrapper[4914]: E0130 21:32:51.648110 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift podName:3a754950-b587-4c0a-85ed-e9669582ea2c nodeName:}" failed. No retries permitted until 2026-01-30 21:32:59.648091537 +0000 UTC m=+1113.086728308 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift") pod "swift-storage-0" (UID: "3a754950-b587-4c0a-85ed-e9669582ea2c") : configmap "swift-ring-files" not found Jan 30 21:32:52 crc kubenswrapper[4914]: I0130 21:32:52.613847 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 21:32:53 crc kubenswrapper[4914]: I0130 21:32:53.043341 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4n6nd" event={"ID":"73c592d5-ad34-4357-b3a8-ecc0567e1e8d","Type":"ContainerStarted","Data":"46b2f37102698b0e94d08f88a86558ef5dc6ecf95732f2068e59ec18f7da63ff"} Jan 30 21:32:53 crc kubenswrapper[4914]: I0130 21:32:53.072306 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-4n6nd" podStartSLOduration=2.491287146 podStartE2EDuration="6.072290181s" podCreationTimestamp="2026-01-30 21:32:47 +0000 UTC" firstStartedPulling="2026-01-30 21:32:48.624750166 +0000 UTC m=+1102.063386927" lastFinishedPulling="2026-01-30 21:32:52.205753201 +0000 UTC m=+1105.644389962" observedRunningTime="2026-01-30 21:32:53.068869199 +0000 UTC m=+1106.507505960" watchObservedRunningTime="2026-01-30 21:32:53.072290181 +0000 UTC m=+1106.510926942" Jan 30 21:32:53 crc kubenswrapper[4914]: I0130 21:32:53.359845 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:32:53 crc kubenswrapper[4914]: I0130 21:32:53.413023 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-q69cm"] Jan 30 21:32:53 crc kubenswrapper[4914]: I0130 21:32:53.413247 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" podUID="78a46162-d658-4f7b-8643-1ffca846a558" containerName="dnsmasq-dns" 
containerID="cri-o://ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73" gracePeriod=10 Jan 30 21:32:53 crc kubenswrapper[4914]: I0130 21:32:53.976180 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.054932 4914 generic.go:334] "Generic (PLEG): container finished" podID="78a46162-d658-4f7b-8643-1ffca846a558" containerID="ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73" exitCode=0 Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.054980 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" event={"ID":"78a46162-d658-4f7b-8643-1ffca846a558","Type":"ContainerDied","Data":"ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73"} Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.055030 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" event={"ID":"78a46162-d658-4f7b-8643-1ffca846a558","Type":"ContainerDied","Data":"cbccbc5d9e1345b9cb2c9a2a34cd5ce531902f5ff576a13370e95bd61f68df14"} Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.055033 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-q69cm" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.055047 4914 scope.go:117] "RemoveContainer" containerID="ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.094647 4914 scope.go:117] "RemoveContainer" containerID="e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.096440 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftgnd\" (UniqueName: \"kubernetes.io/projected/78a46162-d658-4f7b-8643-1ffca846a558-kube-api-access-ftgnd\") pod \"78a46162-d658-4f7b-8643-1ffca846a558\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.096485 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-sb\") pod \"78a46162-d658-4f7b-8643-1ffca846a558\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.096567 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-dns-svc\") pod \"78a46162-d658-4f7b-8643-1ffca846a558\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.096596 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-nb\") pod \"78a46162-d658-4f7b-8643-1ffca846a558\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.096630 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-config\") pod \"78a46162-d658-4f7b-8643-1ffca846a558\" (UID: \"78a46162-d658-4f7b-8643-1ffca846a558\") " Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.108012 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a46162-d658-4f7b-8643-1ffca846a558-kube-api-access-ftgnd" (OuterVolumeSpecName: "kube-api-access-ftgnd") pod "78a46162-d658-4f7b-8643-1ffca846a558" (UID: "78a46162-d658-4f7b-8643-1ffca846a558"). InnerVolumeSpecName "kube-api-access-ftgnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.134580 4914 scope.go:117] "RemoveContainer" containerID="ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73" Jan 30 21:32:54 crc kubenswrapper[4914]: E0130 21:32:54.135000 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73\": container with ID starting with ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73 not found: ID does not exist" containerID="ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.135074 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73"} err="failed to get container status \"ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73\": rpc error: code = NotFound desc = could not find container \"ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73\": container with ID starting with ad7d3a229a9168cfbdcbb320a163b78af38f35ef8b96c2794b04e95375385f73 not found: ID does not exist" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.135122 4914 scope.go:117] "RemoveContainer" 
containerID="e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349" Jan 30 21:32:54 crc kubenswrapper[4914]: E0130 21:32:54.135421 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349\": container with ID starting with e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349 not found: ID does not exist" containerID="e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.135441 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349"} err="failed to get container status \"e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349\": rpc error: code = NotFound desc = could not find container \"e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349\": container with ID starting with e7738e2c4b00ca0771a41133f6ccd7989b5ce5c1fd493ab27713ff4ad8919349 not found: ID does not exist" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.151573 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78a46162-d658-4f7b-8643-1ffca846a558" (UID: "78a46162-d658-4f7b-8643-1ffca846a558"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.162614 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78a46162-d658-4f7b-8643-1ffca846a558" (UID: "78a46162-d658-4f7b-8643-1ffca846a558"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.163580 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-config" (OuterVolumeSpecName: "config") pod "78a46162-d658-4f7b-8643-1ffca846a558" (UID: "78a46162-d658-4f7b-8643-1ffca846a558"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.165432 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78a46162-d658-4f7b-8643-1ffca846a558" (UID: "78a46162-d658-4f7b-8643-1ffca846a558"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.199144 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftgnd\" (UniqueName: \"kubernetes.io/projected/78a46162-d658-4f7b-8643-1ffca846a558-kube-api-access-ftgnd\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.199183 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.199197 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.199210 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 
21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.199222 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a46162-d658-4f7b-8643-1ffca846a558-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.349420 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.384223 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-q69cm"] Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.390949 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-q69cm"] Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.607603 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 30 21:32:54 crc kubenswrapper[4914]: I0130 21:32:54.720041 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.831502 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a46162-d658-4f7b-8643-1ffca846a558" path="/var/lib/kubelet/pods/78a46162-d658-4f7b-8643-1ffca846a558/volumes" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.979843 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-dd72-account-create-update-dltrr"] Jan 30 21:32:55 crc kubenswrapper[4914]: E0130 21:32:55.980239 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a46162-d658-4f7b-8643-1ffca846a558" containerName="init" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.980255 4914 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78a46162-d658-4f7b-8643-1ffca846a558" containerName="init" Jan 30 21:32:55 crc kubenswrapper[4914]: E0130 21:32:55.980295 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a46162-d658-4f7b-8643-1ffca846a558" containerName="dnsmasq-dns" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.980301 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a46162-d658-4f7b-8643-1ffca846a558" containerName="dnsmasq-dns" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.980484 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a46162-d658-4f7b-8643-1ffca846a558" containerName="dnsmasq-dns" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.981124 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.983127 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.992520 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-px97v"] Jan 30 21:32:55 crc kubenswrapper[4914]: I0130 21:32:55.993697 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.000264 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-dd72-account-create-update-dltrr"] Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.007886 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-px97v"] Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.053786 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e095e-5ba2-4450-9006-3d471fd30225-operator-scripts\") pod \"glance-dd72-account-create-update-dltrr\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.054248 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhpz\" (UniqueName: \"kubernetes.io/projected/8f0e095e-5ba2-4450-9006-3d471fd30225-kube-api-access-dwhpz\") pod \"glance-dd72-account-create-update-dltrr\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.156422 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhpz\" (UniqueName: \"kubernetes.io/projected/8f0e095e-5ba2-4450-9006-3d471fd30225-kube-api-access-dwhpz\") pod \"glance-dd72-account-create-update-dltrr\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.156498 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87a04c4b-a588-403c-b989-d1eb41d4cd13-operator-scripts\") pod 
\"glance-db-create-px97v\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.156617 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq5m9\" (UniqueName: \"kubernetes.io/projected/87a04c4b-a588-403c-b989-d1eb41d4cd13-kube-api-access-bq5m9\") pod \"glance-db-create-px97v\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.156750 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e095e-5ba2-4450-9006-3d471fd30225-operator-scripts\") pod \"glance-dd72-account-create-update-dltrr\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.157825 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e095e-5ba2-4450-9006-3d471fd30225-operator-scripts\") pod \"glance-dd72-account-create-update-dltrr\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.188977 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhpz\" (UniqueName: \"kubernetes.io/projected/8f0e095e-5ba2-4450-9006-3d471fd30225-kube-api-access-dwhpz\") pod \"glance-dd72-account-create-update-dltrr\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.259035 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq5m9\" (UniqueName: 
\"kubernetes.io/projected/87a04c4b-a588-403c-b989-d1eb41d4cd13-kube-api-access-bq5m9\") pod \"glance-db-create-px97v\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.259220 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87a04c4b-a588-403c-b989-d1eb41d4cd13-operator-scripts\") pod \"glance-db-create-px97v\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.259966 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87a04c4b-a588-403c-b989-d1eb41d4cd13-operator-scripts\") pod \"glance-db-create-px97v\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.282227 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq5m9\" (UniqueName: \"kubernetes.io/projected/87a04c4b-a588-403c-b989-d1eb41d4cd13-kube-api-access-bq5m9\") pod \"glance-db-create-px97v\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " pod="openstack/glance-db-create-px97v" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.315800 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:32:56 crc kubenswrapper[4914]: I0130 21:32:56.325775 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-px97v" Jan 30 21:32:57 crc kubenswrapper[4914]: I0130 21:32:57.009038 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 30 21:32:57 crc kubenswrapper[4914]: I0130 21:32:57.743981 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rdzm9" podUID="3f063a16-987d-4378-b889-966755034c3e" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:32:57 crc kubenswrapper[4914]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:32:57 crc kubenswrapper[4914]: > Jan 30 21:32:57 crc kubenswrapper[4914]: I0130 21:32:57.849591 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vdgd5"] Jan 30 21:32:57 crc kubenswrapper[4914]: I0130 21:32:57.850635 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:57 crc kubenswrapper[4914]: I0130 21:32:57.855156 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 21:32:57 crc kubenswrapper[4914]: I0130 21:32:57.864867 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vdgd5"] Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.009116 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4jh\" (UniqueName: \"kubernetes.io/projected/5bd50441-346e-4886-8273-5196b7d5e35d-kube-api-access-jx4jh\") pod \"root-account-create-update-vdgd5\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.009185 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5bd50441-346e-4886-8273-5196b7d5e35d-operator-scripts\") pod \"root-account-create-update-vdgd5\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.110875 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4jh\" (UniqueName: \"kubernetes.io/projected/5bd50441-346e-4886-8273-5196b7d5e35d-kube-api-access-jx4jh\") pod \"root-account-create-update-vdgd5\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.110921 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd50441-346e-4886-8273-5196b7d5e35d-operator-scripts\") pod \"root-account-create-update-vdgd5\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.111851 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd50441-346e-4886-8273-5196b7d5e35d-operator-scripts\") pod \"root-account-create-update-vdgd5\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.131434 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4jh\" (UniqueName: \"kubernetes.io/projected/5bd50441-346e-4886-8273-5196b7d5e35d-kube-api-access-jx4jh\") pod \"root-account-create-update-vdgd5\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:58 crc kubenswrapper[4914]: I0130 21:32:58.196622 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vdgd5" Jan 30 21:32:59 crc kubenswrapper[4914]: I0130 21:32:59.441914 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-px97v"] Jan 30 21:32:59 crc kubenswrapper[4914]: W0130 21:32:59.448032 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87a04c4b_a588_403c_b989_d1eb41d4cd13.slice/crio-4c6a175b644dad93b6ea2e77313ff016abc1c6f6ba7d0c2eec0d5057f7a5c07e WatchSource:0}: Error finding container 4c6a175b644dad93b6ea2e77313ff016abc1c6f6ba7d0c2eec0d5057f7a5c07e: Status 404 returned error can't find the container with id 4c6a175b644dad93b6ea2e77313ff016abc1c6f6ba7d0c2eec0d5057f7a5c07e Jan 30 21:32:59 crc kubenswrapper[4914]: I0130 21:32:59.552696 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vdgd5"] Jan 30 21:32:59 crc kubenswrapper[4914]: W0130 21:32:59.573059 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bd50441_346e_4886_8273_5196b7d5e35d.slice/crio-b6b5e3453ec2a1e7a65292c33c2e5a74e94c464441a40cfe42ff843048306eed WatchSource:0}: Error finding container b6b5e3453ec2a1e7a65292c33c2e5a74e94c464441a40cfe42ff843048306eed: Status 404 returned error can't find the container with id b6b5e3453ec2a1e7a65292c33c2e5a74e94c464441a40cfe42ff843048306eed Jan 30 21:32:59 crc kubenswrapper[4914]: I0130 21:32:59.599753 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-dd72-account-create-update-dltrr"] Jan 30 21:32:59 crc kubenswrapper[4914]: W0130 21:32:59.604683 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f0e095e_5ba2_4450_9006_3d471fd30225.slice/crio-b8e8ac70ac13e0e3ddbe4d5427554eb4202360c7b7c04d3b7124de91b4f0b096 WatchSource:0}: Error finding 
container b8e8ac70ac13e0e3ddbe4d5427554eb4202360c7b7c04d3b7124de91b4f0b096: Status 404 returned error can't find the container with id b8e8ac70ac13e0e3ddbe4d5427554eb4202360c7b7c04d3b7124de91b4f0b096 Jan 30 21:32:59 crc kubenswrapper[4914]: I0130 21:32:59.744036 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:32:59 crc kubenswrapper[4914]: E0130 21:32:59.744248 4914 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:32:59 crc kubenswrapper[4914]: E0130 21:32:59.744275 4914 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:32:59 crc kubenswrapper[4914]: E0130 21:32:59.744344 4914 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift podName:3a754950-b587-4c0a-85ed-e9669582ea2c nodeName:}" failed. No retries permitted until 2026-01-30 21:33:15.744321726 +0000 UTC m=+1129.182958487 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift") pod "swift-storage-0" (UID: "3a754950-b587-4c0a-85ed-e9669582ea2c") : configmap "swift-ring-files" not found Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.139873 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerStarted","Data":"1dab7a42f9d9c2fda2084cab46af4fe31c657028a26e53b83a24bc3de3ccf0f8"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.142189 4914 generic.go:334] "Generic (PLEG): container finished" podID="8f0e095e-5ba2-4450-9006-3d471fd30225" containerID="ca2f074c48f1a84505896a3ddbe39d8760969f813b16824be596fb08d08b9dff" exitCode=0 Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.142279 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dd72-account-create-update-dltrr" event={"ID":"8f0e095e-5ba2-4450-9006-3d471fd30225","Type":"ContainerDied","Data":"ca2f074c48f1a84505896a3ddbe39d8760969f813b16824be596fb08d08b9dff"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.142311 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dd72-account-create-update-dltrr" event={"ID":"8f0e095e-5ba2-4450-9006-3d471fd30225","Type":"ContainerStarted","Data":"b8e8ac70ac13e0e3ddbe4d5427554eb4202360c7b7c04d3b7124de91b4f0b096"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.144037 4914 generic.go:334] "Generic (PLEG): container finished" podID="87a04c4b-a588-403c-b989-d1eb41d4cd13" containerID="defaddb59a3666031661e46a1e73c0d1ec9c8c43cb0c6ad9646fc9721b995ba3" exitCode=0 Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.144121 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-px97v" 
event={"ID":"87a04c4b-a588-403c-b989-d1eb41d4cd13","Type":"ContainerDied","Data":"defaddb59a3666031661e46a1e73c0d1ec9c8c43cb0c6ad9646fc9721b995ba3"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.144148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-px97v" event={"ID":"87a04c4b-a588-403c-b989-d1eb41d4cd13","Type":"ContainerStarted","Data":"4c6a175b644dad93b6ea2e77313ff016abc1c6f6ba7d0c2eec0d5057f7a5c07e"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.145867 4914 generic.go:334] "Generic (PLEG): container finished" podID="5bd50441-346e-4886-8273-5196b7d5e35d" containerID="272968fda7cc662786817a7a393a09260918e5cf25029cbdea32b8853cb52967" exitCode=0 Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.145922 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vdgd5" event={"ID":"5bd50441-346e-4886-8273-5196b7d5e35d","Type":"ContainerDied","Data":"272968fda7cc662786817a7a393a09260918e5cf25029cbdea32b8853cb52967"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.145942 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vdgd5" event={"ID":"5bd50441-346e-4886-8273-5196b7d5e35d","Type":"ContainerStarted","Data":"b6b5e3453ec2a1e7a65292c33c2e5a74e94c464441a40cfe42ff843048306eed"} Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.337894 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jfgqq"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.342055 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.351637 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jfgqq"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.424800 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1945-account-create-update-crf4j"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.426291 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.430350 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.448654 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1945-account-create-update-crf4j"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.462265 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4vh\" (UniqueName: \"kubernetes.io/projected/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-kube-api-access-kq4vh\") pod \"keystone-db-create-jfgqq\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.462352 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-operator-scripts\") pod \"keystone-db-create-jfgqq\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.564029 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4vh\" (UniqueName: 
\"kubernetes.io/projected/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-kube-api-access-kq4vh\") pod \"keystone-db-create-jfgqq\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.564069 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z82jw\" (UniqueName: \"kubernetes.io/projected/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-kube-api-access-z82jw\") pod \"keystone-1945-account-create-update-crf4j\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.564124 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-operator-scripts\") pod \"keystone-db-create-jfgqq\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.564636 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-operator-scripts\") pod \"keystone-1945-account-create-update-crf4j\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.565576 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-operator-scripts\") pod \"keystone-db-create-jfgqq\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.588815 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kq4vh\" (UniqueName: \"kubernetes.io/projected/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-kube-api-access-kq4vh\") pod \"keystone-db-create-jfgqq\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.625080 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-tvrvx"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.626313 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.646829 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tvrvx"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.666688 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-operator-scripts\") pod \"keystone-1945-account-create-update-crf4j\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.666780 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z82jw\" (UniqueName: \"kubernetes.io/projected/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-kube-api-access-z82jw\") pod \"keystone-1945-account-create-update-crf4j\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.667902 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-operator-scripts\") pod \"keystone-1945-account-create-update-crf4j\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " 
pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.683360 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.683779 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z82jw\" (UniqueName: \"kubernetes.io/projected/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-kube-api-access-z82jw\") pod \"keystone-1945-account-create-update-crf4j\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.736226 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7688-account-create-update-m7mwk"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.737426 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.739496 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.746214 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7688-account-create-update-m7mwk"] Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.766133 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.768586 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48tw2\" (UniqueName: \"kubernetes.io/projected/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-kube-api-access-48tw2\") pod \"placement-db-create-tvrvx\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.768655 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-operator-scripts\") pod \"placement-db-create-tvrvx\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.871218 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffzq2\" (UniqueName: \"kubernetes.io/projected/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-kube-api-access-ffzq2\") pod \"placement-7688-account-create-update-m7mwk\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.872035 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48tw2\" (UniqueName: \"kubernetes.io/projected/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-kube-api-access-48tw2\") pod \"placement-db-create-tvrvx\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.872180 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-operator-scripts\") pod \"placement-7688-account-create-update-m7mwk\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.872217 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-operator-scripts\") pod \"placement-db-create-tvrvx\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.873688 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-operator-scripts\") pod \"placement-db-create-tvrvx\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.891512 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48tw2\" (UniqueName: \"kubernetes.io/projected/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-kube-api-access-48tw2\") pod \"placement-db-create-tvrvx\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.951880 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.973592 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffzq2\" (UniqueName: \"kubernetes.io/projected/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-kube-api-access-ffzq2\") pod \"placement-7688-account-create-update-m7mwk\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.973841 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-operator-scripts\") pod \"placement-7688-account-create-update-m7mwk\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.974656 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-operator-scripts\") pod \"placement-7688-account-create-update-m7mwk\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:00 crc kubenswrapper[4914]: I0130 21:33:00.994585 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffzq2\" (UniqueName: \"kubernetes.io/projected/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-kube-api-access-ffzq2\") pod \"placement-7688-account-create-update-m7mwk\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.125230 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.169446 4914 generic.go:334] "Generic (PLEG): container finished" podID="73c592d5-ad34-4357-b3a8-ecc0567e1e8d" containerID="46b2f37102698b0e94d08f88a86558ef5dc6ecf95732f2068e59ec18f7da63ff" exitCode=0 Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.169647 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4n6nd" event={"ID":"73c592d5-ad34-4357-b3a8-ecc0567e1e8d","Type":"ContainerDied","Data":"46b2f37102698b0e94d08f88a86558ef5dc6ecf95732f2068e59ec18f7da63ff"} Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.189666 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jfgqq"] Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.292909 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1945-account-create-update-crf4j"] Jan 30 21:33:01 crc kubenswrapper[4914]: W0130 21:33:01.308153 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d29c2c9_d6e0_4691_b53a_ebd2205fbba3.slice/crio-75cb55c7cd2a9a1977930d43a02ba5ee29f7cee662a5bdc2e223151b889d609b WatchSource:0}: Error finding container 75cb55c7cd2a9a1977930d43a02ba5ee29f7cee662a5bdc2e223151b889d609b: Status 404 returned error can't find the container with id 75cb55c7cd2a9a1977930d43a02ba5ee29f7cee662a5bdc2e223151b889d609b Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.449810 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tvrvx"] Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.721642 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-px97v" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.748599 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vdgd5" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.806136 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7688-account-create-update-m7mwk"] Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.824169 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.909508 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq5m9\" (UniqueName: \"kubernetes.io/projected/87a04c4b-a588-403c-b989-d1eb41d4cd13-kube-api-access-bq5m9\") pod \"87a04c4b-a588-403c-b989-d1eb41d4cd13\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.910113 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87a04c4b-a588-403c-b989-d1eb41d4cd13-operator-scripts\") pod \"87a04c4b-a588-403c-b989-d1eb41d4cd13\" (UID: \"87a04c4b-a588-403c-b989-d1eb41d4cd13\") " Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.910149 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd50441-346e-4886-8273-5196b7d5e35d-operator-scripts\") pod \"5bd50441-346e-4886-8273-5196b7d5e35d\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.910733 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx4jh\" (UniqueName: \"kubernetes.io/projected/5bd50441-346e-4886-8273-5196b7d5e35d-kube-api-access-jx4jh\") pod \"5bd50441-346e-4886-8273-5196b7d5e35d\" (UID: \"5bd50441-346e-4886-8273-5196b7d5e35d\") " Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.912574 4914 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a04c4b-a588-403c-b989-d1eb41d4cd13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87a04c4b-a588-403c-b989-d1eb41d4cd13" (UID: "87a04c4b-a588-403c-b989-d1eb41d4cd13"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.913491 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd50441-346e-4886-8273-5196b7d5e35d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bd50441-346e-4886-8273-5196b7d5e35d" (UID: "5bd50441-346e-4886-8273-5196b7d5e35d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.913903 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87a04c4b-a588-403c-b989-d1eb41d4cd13-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.913930 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd50441-346e-4886-8273-5196b7d5e35d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.923734 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a04c4b-a588-403c-b989-d1eb41d4cd13-kube-api-access-bq5m9" (OuterVolumeSpecName: "kube-api-access-bq5m9") pod "87a04c4b-a588-403c-b989-d1eb41d4cd13" (UID: "87a04c4b-a588-403c-b989-d1eb41d4cd13"). InnerVolumeSpecName "kube-api-access-bq5m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:01 crc kubenswrapper[4914]: I0130 21:33:01.925385 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd50441-346e-4886-8273-5196b7d5e35d-kube-api-access-jx4jh" (OuterVolumeSpecName: "kube-api-access-jx4jh") pod "5bd50441-346e-4886-8273-5196b7d5e35d" (UID: "5bd50441-346e-4886-8273-5196b7d5e35d"). InnerVolumeSpecName "kube-api-access-jx4jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.015884 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e095e-5ba2-4450-9006-3d471fd30225-operator-scripts\") pod \"8f0e095e-5ba2-4450-9006-3d471fd30225\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.015967 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhpz\" (UniqueName: \"kubernetes.io/projected/8f0e095e-5ba2-4450-9006-3d471fd30225-kube-api-access-dwhpz\") pod \"8f0e095e-5ba2-4450-9006-3d471fd30225\" (UID: \"8f0e095e-5ba2-4450-9006-3d471fd30225\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.016350 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0e095e-5ba2-4450-9006-3d471fd30225-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f0e095e-5ba2-4450-9006-3d471fd30225" (UID: "8f0e095e-5ba2-4450-9006-3d471fd30225"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.016897 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx4jh\" (UniqueName: \"kubernetes.io/projected/5bd50441-346e-4886-8273-5196b7d5e35d-kube-api-access-jx4jh\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.016918 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq5m9\" (UniqueName: \"kubernetes.io/projected/87a04c4b-a588-403c-b989-d1eb41d4cd13-kube-api-access-bq5m9\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.016929 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e095e-5ba2-4450-9006-3d471fd30225-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.019737 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0e095e-5ba2-4450-9006-3d471fd30225-kube-api-access-dwhpz" (OuterVolumeSpecName: "kube-api-access-dwhpz") pod "8f0e095e-5ba2-4450-9006-3d471fd30225" (UID: "8f0e095e-5ba2-4450-9006-3d471fd30225"). InnerVolumeSpecName "kube-api-access-dwhpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.118817 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhpz\" (UniqueName: \"kubernetes.io/projected/8f0e095e-5ba2-4450-9006-3d471fd30225-kube-api-access-dwhpz\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.178582 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-px97v" event={"ID":"87a04c4b-a588-403c-b989-d1eb41d4cd13","Type":"ContainerDied","Data":"4c6a175b644dad93b6ea2e77313ff016abc1c6f6ba7d0c2eec0d5057f7a5c07e"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.178617 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c6a175b644dad93b6ea2e77313ff016abc1c6f6ba7d0c2eec0d5057f7a5c07e" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.178625 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-px97v" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.180175 4914 generic.go:334] "Generic (PLEG): container finished" podID="0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" containerID="651d13073f2a0535d794b564d524f0be0667c6cb530bc8564966ffd0f705236b" exitCode=0 Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.180232 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfgqq" event={"ID":"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656","Type":"ContainerDied","Data":"651d13073f2a0535d794b564d524f0be0667c6cb530bc8564966ffd0f705236b"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.180250 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfgqq" event={"ID":"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656","Type":"ContainerStarted","Data":"192cfd544baf2f39e94bc320f33698060c8c6601b07dff23c755d7f4d60d245c"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.182341 4914 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vdgd5" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.182373 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vdgd5" event={"ID":"5bd50441-346e-4886-8273-5196b7d5e35d","Type":"ContainerDied","Data":"b6b5e3453ec2a1e7a65292c33c2e5a74e94c464441a40cfe42ff843048306eed"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.182407 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b5e3453ec2a1e7a65292c33c2e5a74e94c464441a40cfe42ff843048306eed" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.189543 4914 generic.go:334] "Generic (PLEG): container finished" podID="3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" containerID="6b3aa84496d9457083f50d01b95c90e861e78b5b44e664adc23c6ee98b777844" exitCode=0 Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.189845 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1945-account-create-update-crf4j" event={"ID":"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3","Type":"ContainerDied","Data":"6b3aa84496d9457083f50d01b95c90e861e78b5b44e664adc23c6ee98b777844"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.189897 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1945-account-create-update-crf4j" event={"ID":"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3","Type":"ContainerStarted","Data":"75cb55c7cd2a9a1977930d43a02ba5ee29f7cee662a5bdc2e223151b889d609b"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.191762 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7688-account-create-update-m7mwk" event={"ID":"d6d8a7cc-b7c4-408f-a50d-9abb436695d8","Type":"ContainerStarted","Data":"dca335e33d26033d2a549eec28d093f1e5329310df9ddac334aa3550e922e28d"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.191808 4914 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-7688-account-create-update-m7mwk" event={"ID":"d6d8a7cc-b7c4-408f-a50d-9abb436695d8","Type":"ContainerStarted","Data":"6f3b77c407cb072881967172c62eeec549f17752c2f3ae27d6b5c2abe15776d5"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.194523 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerStarted","Data":"16d66c3ee650e0ad391e0dbde55efa72c1d2afa5033907e89093e75d51aee2c9"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.197420 4914 generic.go:334] "Generic (PLEG): container finished" podID="2ed24889-cc02-4c4d-ba7f-7b397f1516ae" containerID="ef8f74605cd99918fd80a790fad82d36171ec6bd3b5e9f0e93320ff1147f1b06" exitCode=0 Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.197483 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tvrvx" event={"ID":"2ed24889-cc02-4c4d-ba7f-7b397f1516ae","Type":"ContainerDied","Data":"ef8f74605cd99918fd80a790fad82d36171ec6bd3b5e9f0e93320ff1147f1b06"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.197527 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tvrvx" event={"ID":"2ed24889-cc02-4c4d-ba7f-7b397f1516ae","Type":"ContainerStarted","Data":"295ae8428df19bf001843640ca28fa96d4d85e9ea608dd2287c2ddab61148223"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.203485 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-dd72-account-create-update-dltrr" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.203549 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dd72-account-create-update-dltrr" event={"ID":"8f0e095e-5ba2-4450-9006-3d471fd30225","Type":"ContainerDied","Data":"b8e8ac70ac13e0e3ddbe4d5427554eb4202360c7b7c04d3b7124de91b4f0b096"} Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.203583 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e8ac70ac13e0e3ddbe4d5427554eb4202360c7b7c04d3b7124de91b4f0b096" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.262305 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7688-account-create-update-m7mwk" podStartSLOduration=2.26228428 podStartE2EDuration="2.26228428s" podCreationTimestamp="2026-01-30 21:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:02.259624316 +0000 UTC m=+1115.698261077" watchObservedRunningTime="2026-01-30 21:33:02.26228428 +0000 UTC m=+1115.700921041" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.603584 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740639 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-dispersionconf\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740691 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-ring-data-devices\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740791 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsfvn\" (UniqueName: \"kubernetes.io/projected/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-kube-api-access-jsfvn\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740849 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-swiftconf\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740887 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-etc-swift\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740914 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-combined-ca-bundle\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.740934 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-scripts\") pod \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\" (UID: \"73c592d5-ad34-4357-b3a8-ecc0567e1e8d\") " Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.741413 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.742013 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.747951 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rdzm9" podUID="3f063a16-987d-4378-b889-966755034c3e" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:33:02 crc kubenswrapper[4914]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:33:02 crc kubenswrapper[4914]: > Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.748222 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-kube-api-access-jsfvn" (OuterVolumeSpecName: "kube-api-access-jsfvn") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "kube-api-access-jsfvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.751391 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.776056 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.778134 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-scripts" (OuterVolumeSpecName: "scripts") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.781642 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c592d5-ad34-4357-b3a8-ecc0567e1e8d" (UID: "73c592d5-ad34-4357-b3a8-ecc0567e1e8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.843475 4914 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.843516 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.843532 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.843546 4914 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc 
kubenswrapper[4914]: I0130 21:33:02.843559 4914 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.843570 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsfvn\" (UniqueName: \"kubernetes.io/projected/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-kube-api-access-jsfvn\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:02 crc kubenswrapper[4914]: I0130 21:33:02.843582 4914 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/73c592d5-ad34-4357-b3a8-ecc0567e1e8d-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.218888 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4n6nd" event={"ID":"73c592d5-ad34-4357-b3a8-ecc0567e1e8d","Type":"ContainerDied","Data":"5d9833d7e18d12f52af707bafeae69e30578e24be8545685ede1274142b848cb"} Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.218950 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9833d7e18d12f52af707bafeae69e30578e24be8545685ede1274142b848cb" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.219306 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4n6nd" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.222497 4914 generic.go:334] "Generic (PLEG): container finished" podID="d6d8a7cc-b7c4-408f-a50d-9abb436695d8" containerID="dca335e33d26033d2a549eec28d093f1e5329310df9ddac334aa3550e922e28d" exitCode=0 Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.223159 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7688-account-create-update-m7mwk" event={"ID":"d6d8a7cc-b7c4-408f-a50d-9abb436695d8","Type":"ContainerDied","Data":"dca335e33d26033d2a549eec28d093f1e5329310df9ddac334aa3550e922e28d"} Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.733344 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.871086 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-operator-scripts\") pod \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.871246 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48tw2\" (UniqueName: \"kubernetes.io/projected/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-kube-api-access-48tw2\") pod \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\" (UID: \"2ed24889-cc02-4c4d-ba7f-7b397f1516ae\") " Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.871877 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ed24889-cc02-4c4d-ba7f-7b397f1516ae" (UID: "2ed24889-cc02-4c4d-ba7f-7b397f1516ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.875166 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.878339 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-kube-api-access-48tw2" (OuterVolumeSpecName: "kube-api-access-48tw2") pod "2ed24889-cc02-4c4d-ba7f-7b397f1516ae" (UID: "2ed24889-cc02-4c4d-ba7f-7b397f1516ae"). InnerVolumeSpecName "kube-api-access-48tw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.878797 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.972942 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z82jw\" (UniqueName: \"kubernetes.io/projected/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-kube-api-access-z82jw\") pod \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.973360 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-operator-scripts\") pod \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") " Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.973516 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq4vh\" (UniqueName: \"kubernetes.io/projected/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-kube-api-access-kq4vh\") pod \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\" (UID: \"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656\") 
" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.973563 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-operator-scripts\") pod \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\" (UID: \"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3\") " Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.973857 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" (UID: "0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.974303 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" (UID: "3d29c2c9-d6e0-4691-b53a-ebd2205fbba3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.974415 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.974438 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.974450 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.974463 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48tw2\" (UniqueName: \"kubernetes.io/projected/2ed24889-cc02-4c4d-ba7f-7b397f1516ae-kube-api-access-48tw2\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.977057 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-kube-api-access-kq4vh" (OuterVolumeSpecName: "kube-api-access-kq4vh") pod "0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" (UID: "0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656"). InnerVolumeSpecName "kube-api-access-kq4vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:03 crc kubenswrapper[4914]: I0130 21:33:03.977422 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-kube-api-access-z82jw" (OuterVolumeSpecName: "kube-api-access-z82jw") pod "3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" (UID: "3d29c2c9-d6e0-4691-b53a-ebd2205fbba3"). 
InnerVolumeSpecName "kube-api-access-z82jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.076566 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq4vh\" (UniqueName: \"kubernetes.io/projected/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656-kube-api-access-kq4vh\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.076599 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z82jw\" (UniqueName: \"kubernetes.io/projected/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3-kube-api-access-z82jw\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.166758 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vdgd5"] Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.172750 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vdgd5"] Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.230617 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfgqq" event={"ID":"0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656","Type":"ContainerDied","Data":"192cfd544baf2f39e94bc320f33698060c8c6601b07dff23c755d7f4d60d245c"} Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.230654 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="192cfd544baf2f39e94bc320f33698060c8c6601b07dff23c755d7f4d60d245c" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.230722 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jfgqq" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.232818 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1945-account-create-update-crf4j" event={"ID":"3d29c2c9-d6e0-4691-b53a-ebd2205fbba3","Type":"ContainerDied","Data":"75cb55c7cd2a9a1977930d43a02ba5ee29f7cee662a5bdc2e223151b889d609b"} Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.232853 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75cb55c7cd2a9a1977930d43a02ba5ee29f7cee662a5bdc2e223151b889d609b" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.232906 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1945-account-create-update-crf4j" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.235844 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tvrvx" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.236592 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tvrvx" event={"ID":"2ed24889-cc02-4c4d-ba7f-7b397f1516ae","Type":"ContainerDied","Data":"295ae8428df19bf001843640ca28fa96d4d85e9ea608dd2287c2ddab61148223"} Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.236613 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295ae8428df19bf001843640ca28fa96d4d85e9ea608dd2287c2ddab61148223" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.365889 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.521379 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.687443 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffzq2\" (UniqueName: \"kubernetes.io/projected/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-kube-api-access-ffzq2\") pod \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.687542 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-operator-scripts\") pod \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\" (UID: \"d6d8a7cc-b7c4-408f-a50d-9abb436695d8\") " Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.688643 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6d8a7cc-b7c4-408f-a50d-9abb436695d8" (UID: "d6d8a7cc-b7c4-408f-a50d-9abb436695d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.694050 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-kube-api-access-ffzq2" (OuterVolumeSpecName: "kube-api-access-ffzq2") pod "d6d8a7cc-b7c4-408f-a50d-9abb436695d8" (UID: "d6d8a7cc-b7c4-408f-a50d-9abb436695d8"). InnerVolumeSpecName "kube-api-access-ffzq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.789395 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffzq2\" (UniqueName: \"kubernetes.io/projected/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-kube-api-access-ffzq2\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:04 crc kubenswrapper[4914]: I0130 21:33:04.789431 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6d8a7cc-b7c4-408f-a50d-9abb436695d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:05 crc kubenswrapper[4914]: I0130 21:33:05.247263 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7688-account-create-update-m7mwk" event={"ID":"d6d8a7cc-b7c4-408f-a50d-9abb436695d8","Type":"ContainerDied","Data":"6f3b77c407cb072881967172c62eeec549f17752c2f3ae27d6b5c2abe15776d5"} Jan 30 21:33:05 crc kubenswrapper[4914]: I0130 21:33:05.247500 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3b77c407cb072881967172c62eeec549f17752c2f3ae27d6b5c2abe15776d5" Jan 30 21:33:05 crc kubenswrapper[4914]: I0130 21:33:05.247320 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7688-account-create-update-m7mwk" Jan 30 21:33:05 crc kubenswrapper[4914]: I0130 21:33:05.831211 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd50441-346e-4886-8273-5196b7d5e35d" path="/var/lib/kubelet/pods/5bd50441-346e-4886-8273-5196b7d5e35d/volumes" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.274878 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lx7cf"] Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275193 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a04c4b-a588-403c-b989-d1eb41d4cd13" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275208 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a04c4b-a588-403c-b989-d1eb41d4cd13" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275216 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0e095e-5ba2-4450-9006-3d471fd30225" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275222 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e095e-5ba2-4450-9006-3d471fd30225" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275235 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d8a7cc-b7c4-408f-a50d-9abb436695d8" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275241 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d8a7cc-b7c4-408f-a50d-9abb436695d8" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275258 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd50441-346e-4886-8273-5196b7d5e35d" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc 
kubenswrapper[4914]: I0130 21:33:06.275264 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd50441-346e-4886-8273-5196b7d5e35d" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275272 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275278 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275286 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c592d5-ad34-4357-b3a8-ecc0567e1e8d" containerName="swift-ring-rebalance" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275291 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c592d5-ad34-4357-b3a8-ecc0567e1e8d" containerName="swift-ring-rebalance" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275299 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275305 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: E0130 21:33:06.275318 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed24889-cc02-4c4d-ba7f-7b397f1516ae" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275324 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed24889-cc02-4c4d-ba7f-7b397f1516ae" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275463 4914 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275472 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd50441-346e-4886-8273-5196b7d5e35d" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275480 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d8a7cc-b7c4-408f-a50d-9abb436695d8" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275492 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275504 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed24889-cc02-4c4d-ba7f-7b397f1516ae" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275511 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0e095e-5ba2-4450-9006-3d471fd30225" containerName="mariadb-account-create-update" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275521 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a04c4b-a588-403c-b989-d1eb41d4cd13" containerName="mariadb-database-create" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.275531 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c592d5-ad34-4357-b3a8-ecc0567e1e8d" containerName="swift-ring-rebalance" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.278748 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.280830 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2msrh" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.281268 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.294812 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lx7cf"] Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.445763 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-db-sync-config-data\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.446829 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrm7\" (UniqueName: \"kubernetes.io/projected/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-kube-api-access-csrm7\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.446909 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-config-data\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.446989 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-combined-ca-bundle\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.548005 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-db-sync-config-data\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.548095 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrm7\" (UniqueName: \"kubernetes.io/projected/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-kube-api-access-csrm7\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.548137 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-config-data\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.548200 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-combined-ca-bundle\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.556903 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-combined-ca-bundle\") pod \"glance-db-sync-lx7cf\" (UID: 
\"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.557397 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-config-data\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.558186 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-db-sync-config-data\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.565022 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrm7\" (UniqueName: \"kubernetes.io/projected/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-kube-api-access-csrm7\") pod \"glance-db-sync-lx7cf\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") " pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:06 crc kubenswrapper[4914]: I0130 21:33:06.668143 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lx7cf" Jan 30 21:33:07 crc kubenswrapper[4914]: I0130 21:33:07.241192 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lx7cf"] Jan 30 21:33:07 crc kubenswrapper[4914]: W0130 21:33:07.244731 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf598ffa_7a5d_4a7b_a547_cbf01cdefc25.slice/crio-c050940a536bade19febd4bc1957897cafb8c3ebf0ae0d95f305456518d908d5 WatchSource:0}: Error finding container c050940a536bade19febd4bc1957897cafb8c3ebf0ae0d95f305456518d908d5: Status 404 returned error can't find the container with id c050940a536bade19febd4bc1957897cafb8c3ebf0ae0d95f305456518d908d5 Jan 30 21:33:07 crc kubenswrapper[4914]: I0130 21:33:07.260671 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lx7cf" event={"ID":"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25","Type":"ContainerStarted","Data":"c050940a536bade19febd4bc1957897cafb8c3ebf0ae0d95f305456518d908d5"} Jan 30 21:33:07 crc kubenswrapper[4914]: I0130 21:33:07.264059 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerStarted","Data":"a369e547a341b51f0c0ded3f0c212eabe3b4646b34f50297b3aa4c377d245030"} Jan 30 21:33:07 crc kubenswrapper[4914]: I0130 21:33:07.292208 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.106799679 podStartE2EDuration="1m15.292187424s" podCreationTimestamp="2026-01-30 21:31:52 +0000 UTC" firstStartedPulling="2026-01-30 21:32:07.266101561 +0000 UTC m=+1060.704738322" lastFinishedPulling="2026-01-30 21:33:06.451489306 +0000 UTC m=+1119.890126067" observedRunningTime="2026-01-30 21:33:07.290896583 +0000 UTC m=+1120.729533344" watchObservedRunningTime="2026-01-30 21:33:07.292187424 +0000 UTC 
m=+1120.730824195" Jan 30 21:33:07 crc kubenswrapper[4914]: I0130 21:33:07.769486 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rdzm9" podUID="3f063a16-987d-4378-b889-966755034c3e" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:33:07 crc kubenswrapper[4914]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:33:07 crc kubenswrapper[4914]: > Jan 30 21:33:08 crc kubenswrapper[4914]: I0130 21:33:08.946669 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:08 crc kubenswrapper[4914]: I0130 21:33:08.946954 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:08 crc kubenswrapper[4914]: I0130 21:33:08.962700 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.182250 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dqj2s"] Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.183543 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.185529 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.190066 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dqj2s"] Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.281420 4914 generic.go:334] "Generic (PLEG): container finished" podID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerID="004c6c908d0d5695cba3e148480eb2debb05fb64209a0e7961d73f9232c504b0" exitCode=0 Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.281727 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c506e0ae-e4b2-4cd7-87ea-bc10619f874e","Type":"ContainerDied","Data":"004c6c908d0d5695cba3e148480eb2debb05fb64209a0e7961d73f9232c504b0"} Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.283268 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.295409 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmr5m\" (UniqueName: \"kubernetes.io/projected/7df1cd5f-9cc1-4549-ba23-76c21805e479-kube-api-access-vmr5m\") pod \"root-account-create-update-dqj2s\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.295549 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df1cd5f-9cc1-4549-ba23-76c21805e479-operator-scripts\") pod \"root-account-create-update-dqj2s\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " 
pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.397847 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmr5m\" (UniqueName: \"kubernetes.io/projected/7df1cd5f-9cc1-4549-ba23-76c21805e479-kube-api-access-vmr5m\") pod \"root-account-create-update-dqj2s\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.398069 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df1cd5f-9cc1-4549-ba23-76c21805e479-operator-scripts\") pod \"root-account-create-update-dqj2s\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.400978 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df1cd5f-9cc1-4549-ba23-76c21805e479-operator-scripts\") pod \"root-account-create-update-dqj2s\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.433479 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmr5m\" (UniqueName: \"kubernetes.io/projected/7df1cd5f-9cc1-4549-ba23-76c21805e479-kube-api-access-vmr5m\") pod \"root-account-create-update-dqj2s\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.510258 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:09 crc kubenswrapper[4914]: I0130 21:33:09.955107 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dqj2s"] Jan 30 21:33:09 crc kubenswrapper[4914]: W0130 21:33:09.955309 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df1cd5f_9cc1_4549_ba23_76c21805e479.slice/crio-b5de0379a3992e808fe8dc948d766670e618b13dc825ee722adc6dd1928dd31f WatchSource:0}: Error finding container b5de0379a3992e808fe8dc948d766670e618b13dc825ee722adc6dd1928dd31f: Status 404 returned error can't find the container with id b5de0379a3992e808fe8dc948d766670e618b13dc825ee722adc6dd1928dd31f Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.312853 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c506e0ae-e4b2-4cd7-87ea-bc10619f874e","Type":"ContainerStarted","Data":"70cbbbbafe8cea99c64866db7ecbc122e676e7f4c10a9cddda720373900c0b08"} Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.313166 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.317687 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqj2s" event={"ID":"7df1cd5f-9cc1-4549-ba23-76c21805e479","Type":"ContainerStarted","Data":"9c23f1cb43b6c4e8d1730f0b6363775edd8078930f6ddfa0cb5a4ccbc11d4a1e"} Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.318969 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqj2s" event={"ID":"7df1cd5f-9cc1-4549-ba23-76c21805e479","Type":"ContainerStarted","Data":"b5de0379a3992e808fe8dc948d766670e618b13dc825ee722adc6dd1928dd31f"} Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.321833 4914 generic.go:334] "Generic (PLEG): 
container finished" podID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerID="f8343c308380c5164c5ade6d747612b6694879f8d31de8cbd0fbfc60d77d1c07" exitCode=0 Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.321866 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f394410a-5ff7-4a0c-84ec-4b60c63c707c","Type":"ContainerDied","Data":"f8343c308380c5164c5ade6d747612b6694879f8d31de8cbd0fbfc60d77d1c07"} Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.342888 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.647172307 podStartE2EDuration="1m24.342870553s" podCreationTimestamp="2026-01-30 21:31:46 +0000 UTC" firstStartedPulling="2026-01-30 21:32:06.533600613 +0000 UTC m=+1059.972237374" lastFinishedPulling="2026-01-30 21:32:34.229298859 +0000 UTC m=+1087.667935620" observedRunningTime="2026-01-30 21:33:10.337211886 +0000 UTC m=+1123.775848657" watchObservedRunningTime="2026-01-30 21:33:10.342870553 +0000 UTC m=+1123.781507314" Jan 30 21:33:10 crc kubenswrapper[4914]: I0130 21:33:10.360046 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-dqj2s" podStartSLOduration=1.360010615 podStartE2EDuration="1.360010615s" podCreationTimestamp="2026-01-30 21:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:10.351868609 +0000 UTC m=+1123.790505370" watchObservedRunningTime="2026-01-30 21:33:10.360010615 +0000 UTC m=+1123.798647376" Jan 30 21:33:11 crc kubenswrapper[4914]: I0130 21:33:11.333442 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f394410a-5ff7-4a0c-84ec-4b60c63c707c","Type":"ContainerStarted","Data":"81433ff0f628af87671a18cfd72c656192e9c4f0ecad8d43815099cf1bfc51c1"} Jan 30 21:33:11 crc 
kubenswrapper[4914]: I0130 21:33:11.359017 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371951.495775 podStartE2EDuration="1m25.358999974s" podCreationTimestamp="2026-01-30 21:31:46 +0000 UTC" firstStartedPulling="2026-01-30 21:32:05.90929364 +0000 UTC m=+1059.347930401" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:11.352773104 +0000 UTC m=+1124.791409865" watchObservedRunningTime="2026-01-30 21:33:11.358999974 +0000 UTC m=+1124.797636725" Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.343177 4914 generic.go:334] "Generic (PLEG): container finished" podID="7df1cd5f-9cc1-4549-ba23-76c21805e479" containerID="9c23f1cb43b6c4e8d1730f0b6363775edd8078930f6ddfa0cb5a4ccbc11d4a1e" exitCode=0 Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.343265 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqj2s" event={"ID":"7df1cd5f-9cc1-4549-ba23-76c21805e479","Type":"ContainerDied","Data":"9c23f1cb43b6c4e8d1730f0b6363775edd8078930f6ddfa0cb5a4ccbc11d4a1e"} Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.534573 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.534854 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="prometheus" containerID="cri-o://1dab7a42f9d9c2fda2084cab46af4fe31c657028a26e53b83a24bc3de3ccf0f8" gracePeriod=600 Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.534935 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="config-reloader" 
containerID="cri-o://16d66c3ee650e0ad391e0dbde55efa72c1d2afa5033907e89093e75d51aee2c9" gracePeriod=600 Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.534935 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="thanos-sidecar" containerID="cri-o://a369e547a341b51f0c0ded3f0c212eabe3b4646b34f50297b3aa4c377d245030" gracePeriod=600 Jan 30 21:33:12 crc kubenswrapper[4914]: I0130 21:33:12.742457 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rdzm9" podUID="3f063a16-987d-4378-b889-966755034c3e" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:33:12 crc kubenswrapper[4914]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:33:12 crc kubenswrapper[4914]: > Jan 30 21:33:13 crc kubenswrapper[4914]: I0130 21:33:13.353946 4914 generic.go:334] "Generic (PLEG): container finished" podID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerID="a369e547a341b51f0c0ded3f0c212eabe3b4646b34f50297b3aa4c377d245030" exitCode=0 Jan 30 21:33:13 crc kubenswrapper[4914]: I0130 21:33:13.354297 4914 generic.go:334] "Generic (PLEG): container finished" podID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerID="16d66c3ee650e0ad391e0dbde55efa72c1d2afa5033907e89093e75d51aee2c9" exitCode=0 Jan 30 21:33:13 crc kubenswrapper[4914]: I0130 21:33:13.354308 4914 generic.go:334] "Generic (PLEG): container finished" podID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerID="1dab7a42f9d9c2fda2084cab46af4fe31c657028a26e53b83a24bc3de3ccf0f8" exitCode=0 Jan 30 21:33:13 crc kubenswrapper[4914]: I0130 21:33:13.354029 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerDied","Data":"a369e547a341b51f0c0ded3f0c212eabe3b4646b34f50297b3aa4c377d245030"} Jan 30 21:33:13 
crc kubenswrapper[4914]: I0130 21:33:13.354431 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerDied","Data":"16d66c3ee650e0ad391e0dbde55efa72c1d2afa5033907e89093e75d51aee2c9"} Jan 30 21:33:13 crc kubenswrapper[4914]: I0130 21:33:13.354452 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerDied","Data":"1dab7a42f9d9c2fda2084cab46af4fe31c657028a26e53b83a24bc3de3ccf0f8"} Jan 30 21:33:14 crc kubenswrapper[4914]: I0130 21:33:14.356726 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:33:15 crc kubenswrapper[4914]: I0130 21:33:15.822599 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:33:15 crc kubenswrapper[4914]: I0130 21:33:15.830950 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a754950-b587-4c0a-85ed-e9669582ea2c-etc-swift\") pod \"swift-storage-0\" (UID: \"3a754950-b587-4c0a-85ed-e9669582ea2c\") " pod="openstack/swift-storage-0" Jan 30 21:33:15 crc kubenswrapper[4914]: I0130 21:33:15.966526 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 21:33:16 crc kubenswrapper[4914]: I0130 21:33:16.947097 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:33:17 crc kubenswrapper[4914]: I0130 21:33:17.761523 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rdzm9" podUID="3f063a16-987d-4378-b889-966755034c3e" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:33:17 crc kubenswrapper[4914]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:33:17 crc kubenswrapper[4914]: > Jan 30 21:33:17 crc kubenswrapper[4914]: I0130 21:33:17.807963 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:33:17 crc kubenswrapper[4914]: I0130 21:33:17.808965 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kv2g9" Jan 30 21:33:17 crc kubenswrapper[4914]: I0130 21:33:17.995790 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.039767 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rdzm9-config-l564h"] Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.040880 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.045039 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.058170 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rdzm9-config-l564h"] Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.173435 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.173518 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsxpr\" (UniqueName: \"kubernetes.io/projected/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-kube-api-access-fsxpr\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.173540 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-scripts\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.173562 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-additional-scripts\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: 
\"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.173587 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run-ovn\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.173603 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-log-ovn\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.275665 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-additional-scripts\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.275790 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run-ovn\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.275813 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-log-ovn\") pod 
\"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.275941 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.276019 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsxpr\" (UniqueName: \"kubernetes.io/projected/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-kube-api-access-fsxpr\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.276043 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-scripts\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.276508 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.276536 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run-ovn\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: 
\"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.276862 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-log-ovn\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.277991 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-additional-scripts\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.278913 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-scripts\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.297438 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsxpr\" (UniqueName: \"kubernetes.io/projected/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-kube-api-access-fsxpr\") pod \"ovn-controller-rdzm9-config-l564h\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:18 crc kubenswrapper[4914]: I0130 21:33:18.370768 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.725762 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.730493 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835519 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-web-config\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835574 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-0\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835627 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-thanos-prometheus-http-client-file\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835840 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-tls-assets\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835879 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config\") pod 
\"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835910 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df1cd5f-9cc1-4549-ba23-76c21805e479-operator-scripts\") pod \"7df1cd5f-9cc1-4549-ba23-76c21805e479\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.835933 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmr5m\" (UniqueName: \"kubernetes.io/projected/7df1cd5f-9cc1-4549-ba23-76c21805e479-kube-api-access-vmr5m\") pod \"7df1cd5f-9cc1-4549-ba23-76c21805e479\" (UID: \"7df1cd5f-9cc1-4549-ba23-76c21805e479\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.836061 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.836081 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn2n5\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-kube-api-access-mn2n5\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.836123 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config-out\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.836160 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-2\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.836185 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-1\") pod \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\" (UID: \"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1\") " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.837485 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.837530 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.840997 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df1cd5f-9cc1-4549-ba23-76c21805e479-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7df1cd5f-9cc1-4549-ba23-76c21805e479" (UID: "7df1cd5f-9cc1-4549-ba23-76c21805e479"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.842245 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.844592 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config-out" (OuterVolumeSpecName: "config-out") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.847909 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.847910 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config" (OuterVolumeSpecName: "config") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.850927 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.856223 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-kube-api-access-mn2n5" (OuterVolumeSpecName: "kube-api-access-mn2n5") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "kube-api-access-mn2n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.859805 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df1cd5f-9cc1-4549-ba23-76c21805e479-kube-api-access-vmr5m" (OuterVolumeSpecName: "kube-api-access-vmr5m") pod "7df1cd5f-9cc1-4549-ba23-76c21805e479" (UID: "7df1cd5f-9cc1-4549-ba23-76c21805e479"). InnerVolumeSpecName "kube-api-access-vmr5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.869286 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.895145 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-web-config" (OuterVolumeSpecName: "web-config") pod "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" (UID: "3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945453 4914 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945488 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945500 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df1cd5f-9cc1-4549-ba23-76c21805e479-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945513 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmr5m\" (UniqueName: 
\"kubernetes.io/projected/7df1cd5f-9cc1-4549-ba23-76c21805e479-kube-api-access-vmr5m\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945540 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") on node \"crc\" " Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945553 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn2n5\" (UniqueName: \"kubernetes.io/projected/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-kube-api-access-mn2n5\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945565 4914 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-config-out\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945576 4914 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945588 4914 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945601 4914 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-web-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945612 4914 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.945627 4914 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.975732 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 21:33:20 crc kubenswrapper[4914]: I0130 21:33:20.976014 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9") on node "crc" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.047092 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.119195 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rdzm9-config-l564h"] Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.231757 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.469654 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dqj2s" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.469653 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqj2s" event={"ID":"7df1cd5f-9cc1-4549-ba23-76c21805e479","Type":"ContainerDied","Data":"b5de0379a3992e808fe8dc948d766670e618b13dc825ee722adc6dd1928dd31f"} Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.470032 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5de0379a3992e808fe8dc948d766670e618b13dc825ee722adc6dd1928dd31f" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.471823 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lx7cf" event={"ID":"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25","Type":"ContainerStarted","Data":"11c7a966dafdee6a0789c632fb8ad3b29eeadbcb5982dcdbe340ca69caf078a4"} Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.474228 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"23182a7bdc863326c3e40cf035b3eb5ac5a44d0233303a779c0a4bfa565f0a69"} Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.476939 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1","Type":"ContainerDied","Data":"20ab92bdccb13fd50a7824249fa8712bd7ff89a2d162d0a4859a3093a334ca99"} Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.476969 4914 scope.go:117] "RemoveContainer" containerID="a369e547a341b51f0c0ded3f0c212eabe3b4646b34f50297b3aa4c377d245030" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.477084 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.486551 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rdzm9-config-l564h" event={"ID":"b65ca7e8-9e11-447c-8617-6d9e0fe5e771","Type":"ContainerStarted","Data":"ddd6ea585279fb45564d13808e7fcb6502f0356ac4893d7eb01fcf04b4dda22a"} Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.486605 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rdzm9-config-l564h" event={"ID":"b65ca7e8-9e11-447c-8617-6d9e0fe5e771","Type":"ContainerStarted","Data":"946b51b52d22030b326fcd42eb388cd93ee852866e92ee65599ddeebc3f041f9"} Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.497588 4914 scope.go:117] "RemoveContainer" containerID="16d66c3ee650e0ad391e0dbde55efa72c1d2afa5033907e89093e75d51aee2c9" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.498272 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lx7cf" podStartSLOduration=2.121824717 podStartE2EDuration="15.498238584s" podCreationTimestamp="2026-01-30 21:33:06 +0000 UTC" firstStartedPulling="2026-01-30 21:33:07.24673489 +0000 UTC m=+1120.685371651" lastFinishedPulling="2026-01-30 21:33:20.623148737 +0000 UTC m=+1134.061785518" observedRunningTime="2026-01-30 21:33:21.491647965 +0000 UTC m=+1134.930284726" watchObservedRunningTime="2026-01-30 21:33:21.498238584 +0000 UTC m=+1134.936875345" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.511592 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rdzm9-config-l564h" podStartSLOduration=3.511571585 podStartE2EDuration="3.511571585s" podCreationTimestamp="2026-01-30 21:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:21.50680084 +0000 UTC m=+1134.945437601" 
watchObservedRunningTime="2026-01-30 21:33:21.511571585 +0000 UTC m=+1134.950208346" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.555336 4914 scope.go:117] "RemoveContainer" containerID="1dab7a42f9d9c2fda2084cab46af4fe31c657028a26e53b83a24bc3de3ccf0f8" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.559867 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.571132 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.586941 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:33:21 crc kubenswrapper[4914]: E0130 21:33:21.587394 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="thanos-sidecar" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588432 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="thanos-sidecar" Jan 30 21:33:21 crc kubenswrapper[4914]: E0130 21:33:21.588453 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="config-reloader" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588460 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="config-reloader" Jan 30 21:33:21 crc kubenswrapper[4914]: E0130 21:33:21.588516 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="prometheus" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588522 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="prometheus" Jan 30 21:33:21 crc kubenswrapper[4914]: E0130 21:33:21.588533 4914 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="init-config-reloader" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588539 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="init-config-reloader" Jan 30 21:33:21 crc kubenswrapper[4914]: E0130 21:33:21.588550 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df1cd5f-9cc1-4549-ba23-76c21805e479" containerName="mariadb-account-create-update" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588556 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df1cd5f-9cc1-4549-ba23-76c21805e479" containerName="mariadb-account-create-update" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588733 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="config-reloader" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588878 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df1cd5f-9cc1-4549-ba23-76c21805e479" containerName="mariadb-account-create-update" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588887 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="prometheus" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.588904 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="thanos-sidecar" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.598809 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.601592 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.602875 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.603100 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8cvf7" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.606923 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.607143 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.607264 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.607400 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.607631 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.607942 4914 scope.go:117] "RemoveContainer" containerID="12aae14a6f79949923517112307201ed4c40a81d30ca3db9cb5e044d91f2e9ed" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.609502 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.627003 4914 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.658765 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.658819 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6019a332-1bf4-40c9-9ed7-6956d8532e9c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.658841 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.658904 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.658936 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.658972 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659027 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659052 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659073 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659092 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6019a332-1bf4-40c9-9ed7-6956d8532e9c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659113 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659134 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-config\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.659154 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6qrw\" (UniqueName: \"kubernetes.io/projected/6019a332-1bf4-40c9-9ed7-6956d8532e9c-kube-api-access-s6qrw\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760238 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760522 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760552 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760576 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760597 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6019a332-1bf4-40c9-9ed7-6956d8532e9c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760620 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760640 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-config\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760661 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6qrw\" (UniqueName: \"kubernetes.io/projected/6019a332-1bf4-40c9-9ed7-6956d8532e9c-kube-api-access-s6qrw\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760686 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760722 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6019a332-1bf4-40c9-9ed7-6956d8532e9c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 
21:33:21.760742 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760818 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.760860 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.761011 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.763788 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" 
(UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.764963 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6019a332-1bf4-40c9-9ed7-6956d8532e9c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.766325 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.767229 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6019a332-1bf4-40c9-9ed7-6956d8532e9c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.767546 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.767940 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.767968 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-config\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.769578 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.777135 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6019a332-1bf4-40c9-9ed7-6956d8532e9c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.778219 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6019a332-1bf4-40c9-9ed7-6956d8532e9c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.786296 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.786410 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81f3275c3b55a1f740c68491d4a52891729addfc87ac8642d59b960166a498d8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.795452 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6qrw\" (UniqueName: \"kubernetes.io/projected/6019a332-1bf4-40c9-9ed7-6956d8532e9c-kube-api-access-s6qrw\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.828305 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" path="/var/lib/kubelet/pods/3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1/volumes" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.845053 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a7b6a2a5-baeb-4420-9346-01c329f267c9\") pod \"prometheus-metric-storage-0\" (UID: \"6019a332-1bf4-40c9-9ed7-6956d8532e9c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.946638 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="3e2fd3a0-3c39-4fda-aa23-1a7d79f0d8e1" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Jan 30 21:33:21 crc kubenswrapper[4914]: I0130 21:33:21.946559 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:22 crc kubenswrapper[4914]: I0130 21:33:22.433055 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 21:33:22 crc kubenswrapper[4914]: I0130 21:33:22.508584 4914 generic.go:334] "Generic (PLEG): container finished" podID="b65ca7e8-9e11-447c-8617-6d9e0fe5e771" containerID="ddd6ea585279fb45564d13808e7fcb6502f0356ac4893d7eb01fcf04b4dda22a" exitCode=0 Jan 30 21:33:22 crc kubenswrapper[4914]: I0130 21:33:22.508662 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rdzm9-config-l564h" event={"ID":"b65ca7e8-9e11-447c-8617-6d9e0fe5e771","Type":"ContainerDied","Data":"ddd6ea585279fb45564d13808e7fcb6502f0356ac4893d7eb01fcf04b4dda22a"} Jan 30 21:33:22 crc kubenswrapper[4914]: I0130 21:33:22.510290 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6019a332-1bf4-40c9-9ed7-6956d8532e9c","Type":"ContainerStarted","Data":"6e032aa8457d61f54d5910e22caf93daf388729eb8ab40a259c70375947cad73"} Jan 30 21:33:22 crc kubenswrapper[4914]: I0130 21:33:22.744884 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rdzm9" Jan 30 21:33:23 crc kubenswrapper[4914]: I0130 21:33:23.523102 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"1055c6c7bcb759ac61a70c5e727bc3d5484b8ee332be5cba492a85e80e230fcb"} Jan 30 21:33:23 crc kubenswrapper[4914]: I0130 21:33:23.523548 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"83b575f358c7ad650717826404d255f853785db7d971ba04bcbf6a7f33a73789"} Jan 30 21:33:23 crc kubenswrapper[4914]: I0130 21:33:23.523573 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"dd7a43beeab7b3c1db3f47228de1b3f25be391a873c9c4c4688797d9b11f9409"} Jan 30 21:33:23 crc kubenswrapper[4914]: I0130 21:33:23.523590 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"6b8f270899002254e021e53cc83c268aec7dae72ec676b41d9ab900495881d9b"} Jan 30 21:33:23 crc kubenswrapper[4914]: I0130 21:33:23.881094 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006677 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-log-ovn\") pod \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006761 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run\") pod \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006769 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b65ca7e8-9e11-447c-8617-6d9e0fe5e771" (UID: "b65ca7e8-9e11-447c-8617-6d9e0fe5e771"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006805 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-scripts\") pod \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006845 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsxpr\" (UniqueName: \"kubernetes.io/projected/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-kube-api-access-fsxpr\") pod \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006828 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run" (OuterVolumeSpecName: "var-run") pod "b65ca7e8-9e11-447c-8617-6d9e0fe5e771" (UID: "b65ca7e8-9e11-447c-8617-6d9e0fe5e771"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.006987 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-additional-scripts\") pod \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007027 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run-ovn\") pod \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\" (UID: \"b65ca7e8-9e11-447c-8617-6d9e0fe5e771\") " Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007158 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b65ca7e8-9e11-447c-8617-6d9e0fe5e771" (UID: "b65ca7e8-9e11-447c-8617-6d9e0fe5e771"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007496 4914 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007520 4914 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007532 4914 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007620 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b65ca7e8-9e11-447c-8617-6d9e0fe5e771" (UID: "b65ca7e8-9e11-447c-8617-6d9e0fe5e771"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.007830 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-scripts" (OuterVolumeSpecName: "scripts") pod "b65ca7e8-9e11-447c-8617-6d9e0fe5e771" (UID: "b65ca7e8-9e11-447c-8617-6d9e0fe5e771"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.013330 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-kube-api-access-fsxpr" (OuterVolumeSpecName: "kube-api-access-fsxpr") pod "b65ca7e8-9e11-447c-8617-6d9e0fe5e771" (UID: "b65ca7e8-9e11-447c-8617-6d9e0fe5e771"). InnerVolumeSpecName "kube-api-access-fsxpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.108954 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.108994 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsxpr\" (UniqueName: \"kubernetes.io/projected/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-kube-api-access-fsxpr\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.109007 4914 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ca7e8-9e11-447c-8617-6d9e0fe5e771-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.215876 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rdzm9-config-l564h"] Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.228116 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rdzm9-config-l564h"] Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.348807 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.535595 4914 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="946b51b52d22030b326fcd42eb388cd93ee852866e92ee65599ddeebc3f041f9" Jan 30 21:33:24 crc kubenswrapper[4914]: I0130 21:33:24.535781 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rdzm9-config-l564h" Jan 30 21:33:25 crc kubenswrapper[4914]: I0130 21:33:25.543227 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6019a332-1bf4-40c9-9ed7-6956d8532e9c","Type":"ContainerStarted","Data":"8e1477de41ba110c2948c8e45ef6cca0d33fb0bbcaf85328218dc2ef436e7a63"} Jan 30 21:33:25 crc kubenswrapper[4914]: I0130 21:33:25.546657 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"d70abd50611c15df6ea41ae850f4400722d4e689b906bdec096384ffce86f912"} Jan 30 21:33:25 crc kubenswrapper[4914]: I0130 21:33:25.546676 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"e7a9e00d6be84ca61220b6fd8e02d04f2f78338d8a1bfcc9186739023e5c28c3"} Jan 30 21:33:25 crc kubenswrapper[4914]: I0130 21:33:25.546685 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"ddc8ecbeddaa22bd5e9f7269945b3d1e03c6f57d41e58d3c846089dc6c01ba02"} Jan 30 21:33:25 crc kubenswrapper[4914]: I0130 21:33:25.845250 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65ca7e8-9e11-447c-8617-6d9e0fe5e771" path="/var/lib/kubelet/pods/b65ca7e8-9e11-447c-8617-6d9e0fe5e771/volumes" Jan 30 21:33:26 crc kubenswrapper[4914]: I0130 21:33:26.558279 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"88733f8cbecdf41af8c173915d2f7988b17a60827494193039b9e3c74f9d80b2"}
Jan 30 21:33:27 crc kubenswrapper[4914]: I0130 21:33:27.578454 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"9cd3728662eefc34737214eb10b16aafe2df77c283e50cd42b6e31ac4de7b3c7"}
Jan 30 21:33:27 crc kubenswrapper[4914]: I0130 21:33:27.578864 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"de1decf1a91352f7aa82d44c5d04a2a3f1709306dfdc42678791c1d67b8f369a"}
Jan 30 21:33:27 crc kubenswrapper[4914]: I0130 21:33:27.578880 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"6212672ae10f4c8ee2d22be1d36ad69c22e50f007489ef5b72ec6cd95d0dc11d"}
Jan 30 21:33:27 crc kubenswrapper[4914]: I0130 21:33:27.712920 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.001177 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.300626 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3497-account-create-update-58wqv"]
Jan 30 21:33:28 crc kubenswrapper[4914]: E0130 21:33:28.301233 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65ca7e8-9e11-447c-8617-6d9e0fe5e771" containerName="ovn-config"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.301248 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65ca7e8-9e11-447c-8617-6d9e0fe5e771" containerName="ovn-config"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.301428 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65ca7e8-9e11-447c-8617-6d9e0fe5e771" containerName="ovn-config"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.302033 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.306140 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.311079 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tz86b"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.312225 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.323576 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3497-account-create-update-58wqv"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.351347 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tz86b"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.415901 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451aee5-930f-41e5-8a8c-8b50c3b3a887-operator-scripts\") pod \"cinder-db-create-tz86b\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.416338 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsj5b\" (UniqueName: \"kubernetes.io/projected/580c0e5e-259e-4164-ac77-ca625b915ffa-kube-api-access-bsj5b\") pod \"barbican-3497-account-create-update-58wqv\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.416366 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/580c0e5e-259e-4164-ac77-ca625b915ffa-operator-scripts\") pod \"barbican-3497-account-create-update-58wqv\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.416422 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtt8p\" (UniqueName: \"kubernetes.io/projected/6451aee5-930f-41e5-8a8c-8b50c3b3a887-kube-api-access-gtt8p\") pod \"cinder-db-create-tz86b\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.436528 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8w2bj"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.438026 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.459301 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-665f-account-create-update-bgn68"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.460465 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.463628 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.473766 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8w2bj"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.485406 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-665f-account-create-update-bgn68"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.513206 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-k9mzs"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.514358 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.517372 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.518234 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.518875 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tzbth"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.522958 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.525176 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451aee5-930f-41e5-8a8c-8b50c3b3a887-operator-scripts\") pod \"cinder-db-create-tz86b\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.525232 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsj5b\" (UniqueName: \"kubernetes.io/projected/580c0e5e-259e-4164-ac77-ca625b915ffa-kube-api-access-bsj5b\") pod \"barbican-3497-account-create-update-58wqv\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.525253 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/580c0e5e-259e-4164-ac77-ca625b915ffa-operator-scripts\") pod \"barbican-3497-account-create-update-58wqv\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.525300 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtt8p\" (UniqueName: \"kubernetes.io/projected/6451aee5-930f-41e5-8a8c-8b50c3b3a887-kube-api-access-gtt8p\") pod \"cinder-db-create-tz86b\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.526349 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451aee5-930f-41e5-8a8c-8b50c3b3a887-operator-scripts\") pod \"cinder-db-create-tz86b\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.529827 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/580c0e5e-259e-4164-ac77-ca625b915ffa-operator-scripts\") pod \"barbican-3497-account-create-update-58wqv\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.537910 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-g6c92"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.546143 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.557414 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k9mzs"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.573148 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtt8p\" (UniqueName: \"kubernetes.io/projected/6451aee5-930f-41e5-8a8c-8b50c3b3a887-kube-api-access-gtt8p\") pod \"cinder-db-create-tz86b\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.579155 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsj5b\" (UniqueName: \"kubernetes.io/projected/580c0e5e-259e-4164-ac77-ca625b915ffa-kube-api-access-bsj5b\") pod \"barbican-3497-account-create-update-58wqv\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.597507 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-g6c92"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.614827 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"8272fd66aea2dc8e4f0788cd690df8853efdecd5350c8d3da93b15b3b5e91d17"}
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.615072 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"df6de98613e611c3ef0be1c0c2db4587061ee444fc217379b1543aac61624292"}
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.623640 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627081 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ceb92-59d0-496f-8331-51d4e131b419-operator-scripts\") pod \"cinder-665f-account-create-update-bgn68\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627144 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q6dg\" (UniqueName: \"kubernetes.io/projected/001ceb92-59d0-496f-8331-51d4e131b419-kube-api-access-6q6dg\") pod \"cinder-665f-account-create-update-bgn68\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627199 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp5gl\" (UniqueName: \"kubernetes.io/projected/66ea5989-2da6-4da0-adb5-91da8e9e2779-kube-api-access-bp5gl\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627228 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-config-data\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627244 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-combined-ca-bundle\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627293 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhmq\" (UniqueName: \"kubernetes.io/projected/fb47ba3a-cc1d-48fb-9974-0d30deca719b-kube-api-access-xdhmq\") pod \"barbican-db-create-8w2bj\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") " pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.627323 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb47ba3a-cc1d-48fb-9974-0d30deca719b-operator-scripts\") pod \"barbican-db-create-8w2bj\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") " pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.632179 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f2e8-account-create-update-tnvvs"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.633482 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.636303 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.636909 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.652289 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f2e8-account-create-update-tnvvs"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.729759 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp497\" (UniqueName: \"kubernetes.io/projected/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-kube-api-access-rp497\") pod \"cloudkitty-db-create-g6c92\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") " pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730043 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp5gl\" (UniqueName: \"kubernetes.io/projected/66ea5989-2da6-4da0-adb5-91da8e9e2779-kube-api-access-bp5gl\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730073 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-config-data\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730090 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-combined-ca-bundle\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730146 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhmq\" (UniqueName: \"kubernetes.io/projected/fb47ba3a-cc1d-48fb-9974-0d30deca719b-kube-api-access-xdhmq\") pod \"barbican-db-create-8w2bj\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") " pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730177 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb47ba3a-cc1d-48fb-9974-0d30deca719b-operator-scripts\") pod \"barbican-db-create-8w2bj\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") " pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730204 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-operator-scripts\") pod \"cloudkitty-db-create-g6c92\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") " pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730241 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ceb92-59d0-496f-8331-51d4e131b419-operator-scripts\") pod \"cinder-665f-account-create-update-bgn68\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.730277 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q6dg\" (UniqueName: \"kubernetes.io/projected/001ceb92-59d0-496f-8331-51d4e131b419-kube-api-access-6q6dg\") pod \"cinder-665f-account-create-update-bgn68\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.732968 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb47ba3a-cc1d-48fb-9974-0d30deca719b-operator-scripts\") pod \"barbican-db-create-8w2bj\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") " pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.733169 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ceb92-59d0-496f-8331-51d4e131b419-operator-scripts\") pod \"cinder-665f-account-create-update-bgn68\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.743807 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-config-data\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.748264 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp5gl\" (UniqueName: \"kubernetes.io/projected/66ea5989-2da6-4da0-adb5-91da8e9e2779-kube-api-access-bp5gl\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.753887 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-combined-ca-bundle\") pod \"keystone-db-sync-k9mzs\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.757190 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q6dg\" (UniqueName: \"kubernetes.io/projected/001ceb92-59d0-496f-8331-51d4e131b419-kube-api-access-6q6dg\") pod \"cinder-665f-account-create-update-bgn68\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.763493 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhmq\" (UniqueName: \"kubernetes.io/projected/fb47ba3a-cc1d-48fb-9974-0d30deca719b-kube-api-access-xdhmq\") pod \"barbican-db-create-8w2bj\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") " pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.793560 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-665f-account-create-update-bgn68"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.831752 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp497\" (UniqueName: \"kubernetes.io/projected/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-kube-api-access-rp497\") pod \"cloudkitty-db-create-g6c92\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") " pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.831835 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/141de14b-d8ba-45b3-96de-efe388b9fc35-operator-scripts\") pod \"neutron-f2e8-account-create-update-tnvvs\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") " pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.831873 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h777z\" (UniqueName: \"kubernetes.io/projected/141de14b-d8ba-45b3-96de-efe388b9fc35-kube-api-access-h777z\") pod \"neutron-f2e8-account-create-update-tnvvs\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") " pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.831894 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-operator-scripts\") pod \"cloudkitty-db-create-g6c92\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") " pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.832520 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-operator-scripts\") pod \"cloudkitty-db-create-g6c92\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") " pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.832813 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-4158-account-create-update-hwrhs"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.836256 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.844657 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.847354 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k9mzs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.848759 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-m7dsr"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.850014 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.865929 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-m7dsr"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.887545 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp497\" (UniqueName: \"kubernetes.io/projected/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-kube-api-access-rp497\") pod \"cloudkitty-db-create-g6c92\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") " pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.931050 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-4158-account-create-update-hwrhs"]
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.933179 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/141de14b-d8ba-45b3-96de-efe388b9fc35-operator-scripts\") pod \"neutron-f2e8-account-create-update-tnvvs\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") " pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.933237 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h777z\" (UniqueName: \"kubernetes.io/projected/141de14b-d8ba-45b3-96de-efe388b9fc35-kube-api-access-h777z\") pod \"neutron-f2e8-account-create-update-tnvvs\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") " pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.934257 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/141de14b-d8ba-45b3-96de-efe388b9fc35-operator-scripts\") pod \"neutron-f2e8-account-create-update-tnvvs\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") " pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:28 crc kubenswrapper[4914]: I0130 21:33:28.955054 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h777z\" (UniqueName: \"kubernetes.io/projected/141de14b-d8ba-45b3-96de-efe388b9fc35-kube-api-access-h777z\") pod \"neutron-f2e8-account-create-update-tnvvs\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") " pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.034594 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc7a16-5905-4608-9212-2440d7235a11-operator-scripts\") pod \"neutron-db-create-m7dsr\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") " pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.034647 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jqv\" (UniqueName: \"kubernetes.io/projected/562c29cc-4430-4d60-9577-92ff7849353f-kube-api-access-w5jqv\") pod \"cloudkitty-4158-account-create-update-hwrhs\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") " pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.034671 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mp4c\" (UniqueName: \"kubernetes.io/projected/a9cc7a16-5905-4608-9212-2440d7235a11-kube-api-access-9mp4c\") pod \"neutron-db-create-m7dsr\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") " pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.034742 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562c29cc-4430-4d60-9577-92ff7849353f-operator-scripts\") pod \"cloudkitty-4158-account-create-update-hwrhs\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") " pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.056225 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.137877 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562c29cc-4430-4d60-9577-92ff7849353f-operator-scripts\") pod \"cloudkitty-4158-account-create-update-hwrhs\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") " pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.138050 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc7a16-5905-4608-9212-2440d7235a11-operator-scripts\") pod \"neutron-db-create-m7dsr\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") " pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.138099 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jqv\" (UniqueName: \"kubernetes.io/projected/562c29cc-4430-4d60-9577-92ff7849353f-kube-api-access-w5jqv\") pod \"cloudkitty-4158-account-create-update-hwrhs\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") " pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.138126 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mp4c\" (UniqueName: \"kubernetes.io/projected/a9cc7a16-5905-4608-9212-2440d7235a11-kube-api-access-9mp4c\") pod \"neutron-db-create-m7dsr\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") " pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.139587 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562c29cc-4430-4d60-9577-92ff7849353f-operator-scripts\") pod \"cloudkitty-4158-account-create-update-hwrhs\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") " pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.139837 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc7a16-5905-4608-9212-2440d7235a11-operator-scripts\") pod \"neutron-db-create-m7dsr\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") " pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.170183 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.172815 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mp4c\" (UniqueName: \"kubernetes.io/projected/a9cc7a16-5905-4608-9212-2440d7235a11-kube-api-access-9mp4c\") pod \"neutron-db-create-m7dsr\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") " pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.174434 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jqv\" (UniqueName: \"kubernetes.io/projected/562c29cc-4430-4d60-9577-92ff7849353f-kube-api-access-w5jqv\") pod \"cloudkitty-4158-account-create-update-hwrhs\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") " pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.250814 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.265538 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tz86b"]
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.267045 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.267478 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.362377 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3497-account-create-update-58wqv"]
Jan 30 21:33:29 crc kubenswrapper[4914]: W0130 21:33:29.401026 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580c0e5e_259e_4164_ac77_ca625b915ffa.slice/crio-6fd23d87ef4d7cfe792b20bfd5b33b4a5e38a71db19823109a9aab3663025107 WatchSource:0}: Error finding container 6fd23d87ef4d7cfe792b20bfd5b33b4a5e38a71db19823109a9aab3663025107: Status 404 returned error can't find the container with id 6fd23d87ef4d7cfe792b20bfd5b33b4a5e38a71db19823109a9aab3663025107
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.494751 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-665f-account-create-update-bgn68"]
Jan 30 21:33:29 crc kubenswrapper[4914]: W0130 21:33:29.522616 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001ceb92_59d0_496f_8331_51d4e131b419.slice/crio-2aea15859a3591654b5c5e19d74636c25e0a835d6c68a7cd8366f0c44c49baa9 WatchSource:0}: Error finding container 2aea15859a3591654b5c5e19d74636c25e0a835d6c68a7cd8366f0c44c49baa9: Status 404 returned error can't find the container with id 2aea15859a3591654b5c5e19d74636c25e0a835d6c68a7cd8366f0c44c49baa9
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.584026 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k9mzs"]
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.616868 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8w2bj"]
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.646895 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-g6c92"]
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.654228 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-665f-account-create-update-bgn68" event={"ID":"001ceb92-59d0-496f-8331-51d4e131b419","Type":"ContainerStarted","Data":"2aea15859a3591654b5c5e19d74636c25e0a835d6c68a7cd8366f0c44c49baa9"}
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.676480 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3497-account-create-update-58wqv" event={"ID":"580c0e5e-259e-4164-ac77-ca625b915ffa","Type":"ContainerStarted","Data":"6fd23d87ef4d7cfe792b20bfd5b33b4a5e38a71db19823109a9aab3663025107"}
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.710404 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tz86b" event={"ID":"6451aee5-930f-41e5-8a8c-8b50c3b3a887","Type":"ContainerStarted","Data":"de526fbb49d8de0c3f022f984ad2c24bb0479a806df4e12f22934a95c80c80d2"}
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.760501 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"b96358756117d3514d64c41052d571568f697664e690b468bee249953710333b"}
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.760542 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a754950-b587-4c0a-85ed-e9669582ea2c","Type":"ContainerStarted","Data":"efca643765f9e8d326dc9629a3f0b58041f783b86c1d2c7ff02f9d5722c842a8"}
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.822522 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.403304787 podStartE2EDuration="47.822500622s" podCreationTimestamp="2026-01-30 21:32:42 +0000 UTC" firstStartedPulling="2026-01-30 21:33:21.252274392 +0000 UTC m=+1134.690911153" lastFinishedPulling="2026-01-30 21:33:26.671470227 +0000 UTC m=+1140.110106988" observedRunningTime="2026-01-30 21:33:29.807998143 +0000 UTC m=+1143.246634914" watchObservedRunningTime="2026-01-30 21:33:29.822500622 +0000 UTC m=+1143.261137383"
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.928945 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f2e8-account-create-update-tnvvs"]
Jan 30 21:33:29 crc kubenswrapper[4914]: I0130 21:33:29.979346 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-m7dsr"]
Jan 30 21:33:29 crc kubenswrapper[4914]: W0130 21:33:29.996472 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9cc7a16_5905_4608_9212_2440d7235a11.slice/crio-5fa14da00f8a827b1ea0164b481251fc5418252c94ac43c83826c557c0672755 WatchSource:0}: Error finding container 5fa14da00f8a827b1ea0164b481251fc5418252c94ac43c83826c557c0672755: Status 404 returned error can't find the container with id 5fa14da00f8a827b1ea0164b481251fc5418252c94ac43c83826c557c0672755
Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.034158 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-4158-account-create-update-hwrhs"]
Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.300781 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4dddm"]
Jan 30 21:33:30
crc kubenswrapper[4914]: I0130 21:33:30.302237 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.306310 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.316350 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4dddm"] Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.365752 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-config\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.365803 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.365839 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.365877 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-sb\") pod 
\"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.365913 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgcfm\" (UniqueName: \"kubernetes.io/projected/73907d8c-bf5d-4654-9a14-ae335ab89b11-kube-api-access-tgcfm\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.366150 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.467640 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.467725 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.467760 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgcfm\" (UniqueName: 
\"kubernetes.io/projected/73907d8c-bf5d-4654-9a14-ae335ab89b11-kube-api-access-tgcfm\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.467819 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.467911 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-config\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.467936 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.469226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-config\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.469385 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-nb\") pod 
\"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.469438 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-svc\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.469625 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.469974 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.488004 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgcfm\" (UniqueName: \"kubernetes.io/projected/73907d8c-bf5d-4654-9a14-ae335ab89b11-kube-api-access-tgcfm\") pod \"dnsmasq-dns-764c5664d7-4dddm\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.686066 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.797879 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8w2bj" event={"ID":"fb47ba3a-cc1d-48fb-9974-0d30deca719b","Type":"ContainerStarted","Data":"6cf5a0d9b79786c8302d1183ee5ea12812d37a8aff3ec2c26ba49deebb94df85"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.797921 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8w2bj" event={"ID":"fb47ba3a-cc1d-48fb-9974-0d30deca719b","Type":"ContainerStarted","Data":"3940193fff5cca0c1b5cd15aae9caab37489b253e40e6d1437d45fbbc041ef61"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.810928 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9mzs" event={"ID":"66ea5989-2da6-4da0-adb5-91da8e9e2779","Type":"ContainerStarted","Data":"ec33a78a8b840ec3cc1114dde571a43cd2c0fb3039486ba01e9c90d78fc4af74"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.830656 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-8w2bj" podStartSLOduration=2.83063411 podStartE2EDuration="2.83063411s" podCreationTimestamp="2026-01-30 21:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:30.819568004 +0000 UTC m=+1144.258204765" watchObservedRunningTime="2026-01-30 21:33:30.83063411 +0000 UTC m=+1144.269270881" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.838687 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g6c92" event={"ID":"83c5fce8-6d2b-47ae-ba0d-fbbc545de111","Type":"ContainerStarted","Data":"0d5b5ff41743d65278e9fe8ff021a803ff8b8bbd1562fdbff225c5269d778b54"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.838774 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-db-create-g6c92" event={"ID":"83c5fce8-6d2b-47ae-ba0d-fbbc545de111","Type":"ContainerStarted","Data":"a7a894b4b739eb885ba8d113d9f21e7b8b6c5254d55f617dd55f37e05756bb3f"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.863640 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m7dsr" event={"ID":"a9cc7a16-5905-4608-9212-2440d7235a11","Type":"ContainerStarted","Data":"f2a09f8166e99d336494e7f0904a70e077388b90920bb1b1fe93e52739dd082c"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.864661 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m7dsr" event={"ID":"a9cc7a16-5905-4608-9212-2440d7235a11","Type":"ContainerStarted","Data":"5fa14da00f8a827b1ea0164b481251fc5418252c94ac43c83826c557c0672755"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.877106 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-g6c92" podStartSLOduration=2.875418758 podStartE2EDuration="2.875418758s" podCreationTimestamp="2026-01-30 21:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:30.863287486 +0000 UTC m=+1144.301924247" watchObservedRunningTime="2026-01-30 21:33:30.875418758 +0000 UTC m=+1144.314055519" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.878742 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-665f-account-create-update-bgn68" event={"ID":"001ceb92-59d0-496f-8331-51d4e131b419","Type":"ContainerStarted","Data":"09f1e288be3ca6b19edd36434bc8983709206768ab1e07db2b5fd23334fb3cde"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.887751 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-m7dsr" podStartSLOduration=2.887734695 podStartE2EDuration="2.887734695s" podCreationTimestamp="2026-01-30 21:33:28 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:30.886457844 +0000 UTC m=+1144.325094605" watchObservedRunningTime="2026-01-30 21:33:30.887734695 +0000 UTC m=+1144.326371456" Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.896471 4914 generic.go:334] "Generic (PLEG): container finished" podID="6019a332-1bf4-40c9-9ed7-6956d8532e9c" containerID="8e1477de41ba110c2948c8e45ef6cca0d33fb0bbcaf85328218dc2ef436e7a63" exitCode=0 Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.896562 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6019a332-1bf4-40c9-9ed7-6956d8532e9c","Type":"ContainerDied","Data":"8e1477de41ba110c2948c8e45ef6cca0d33fb0bbcaf85328218dc2ef436e7a63"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.910437 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f2e8-account-create-update-tnvvs" event={"ID":"141de14b-d8ba-45b3-96de-efe388b9fc35","Type":"ContainerStarted","Data":"ba9ef4df2ae722ddcbcf9ebb778a911849eb3f6cda19b59d3fed87a61e483993"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.910483 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f2e8-account-create-update-tnvvs" event={"ID":"141de14b-d8ba-45b3-96de-efe388b9fc35","Type":"ContainerStarted","Data":"dd4b9d2b9577812b2910e57718098439fbc9542a8151c1f3f3c60331d032ada7"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.921884 4914 generic.go:334] "Generic (PLEG): container finished" podID="580c0e5e-259e-4164-ac77-ca625b915ffa" containerID="c90eaa9a4b87197d0c33698c04e8ea796e9b104f4dd2da6447e0da468c1396e9" exitCode=0 Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.921976 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3497-account-create-update-58wqv" 
event={"ID":"580c0e5e-259e-4164-ac77-ca625b915ffa","Type":"ContainerDied","Data":"c90eaa9a4b87197d0c33698c04e8ea796e9b104f4dd2da6447e0da468c1396e9"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.948253 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-4158-account-create-update-hwrhs" event={"ID":"562c29cc-4430-4d60-9577-92ff7849353f","Type":"ContainerStarted","Data":"d8f73b59ed4e69f6877673635767b5659f77808db83cc3e43a655b8df8e3956c"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.948294 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-4158-account-create-update-hwrhs" event={"ID":"562c29cc-4430-4d60-9577-92ff7849353f","Type":"ContainerStarted","Data":"d2fccdfde5581479729e92870d2486c59e55f87998a66180e409bbbb6830efff"} Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.987512 4914 generic.go:334] "Generic (PLEG): container finished" podID="6451aee5-930f-41e5-8a8c-8b50c3b3a887" containerID="2ee04ef2d6c1149023df191f681a0176b24938c76eed9fe58e630af38f3ee8da" exitCode=0 Jan 30 21:33:30 crc kubenswrapper[4914]: I0130 21:33:30.988736 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tz86b" event={"ID":"6451aee5-930f-41e5-8a8c-8b50c3b3a887","Type":"ContainerDied","Data":"2ee04ef2d6c1149023df191f681a0176b24938c76eed9fe58e630af38f3ee8da"} Jan 30 21:33:31 crc kubenswrapper[4914]: I0130 21:33:31.017744 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f2e8-account-create-update-tnvvs" podStartSLOduration=3.017722274 podStartE2EDuration="3.017722274s" podCreationTimestamp="2026-01-30 21:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:30.995487538 +0000 UTC m=+1144.434124299" watchObservedRunningTime="2026-01-30 21:33:31.017722274 +0000 UTC m=+1144.456359035" Jan 30 21:33:31 crc kubenswrapper[4914]: I0130 
21:33:31.056333 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-4158-account-create-update-hwrhs" podStartSLOduration=3.056315853 podStartE2EDuration="3.056315853s" podCreationTimestamp="2026-01-30 21:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:31.048509655 +0000 UTC m=+1144.487146416" watchObservedRunningTime="2026-01-30 21:33:31.056315853 +0000 UTC m=+1144.494952614" Jan 30 21:33:31 crc kubenswrapper[4914]: I0130 21:33:31.298688 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4dddm"] Jan 30 21:33:31 crc kubenswrapper[4914]: W0130 21:33:31.329265 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73907d8c_bf5d_4654_9a14_ae335ab89b11.slice/crio-aec72491671e70eecca16f79bfcdf79633def18a70ac53bddd61de1a3681d5e5 WatchSource:0}: Error finding container aec72491671e70eecca16f79bfcdf79633def18a70ac53bddd61de1a3681d5e5: Status 404 returned error can't find the container with id aec72491671e70eecca16f79bfcdf79633def18a70ac53bddd61de1a3681d5e5 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.003031 4914 generic.go:334] "Generic (PLEG): container finished" podID="001ceb92-59d0-496f-8331-51d4e131b419" containerID="09f1e288be3ca6b19edd36434bc8983709206768ab1e07db2b5fd23334fb3cde" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.003319 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-665f-account-create-update-bgn68" event={"ID":"001ceb92-59d0-496f-8331-51d4e131b419","Type":"ContainerDied","Data":"09f1e288be3ca6b19edd36434bc8983709206768ab1e07db2b5fd23334fb3cde"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.010673 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"6019a332-1bf4-40c9-9ed7-6956d8532e9c","Type":"ContainerStarted","Data":"8eeee504e8f2b81e9e11565bffb9af64f5f72d322c32f88185a220ca6f4cd18c"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.012678 4914 generic.go:334] "Generic (PLEG): container finished" podID="141de14b-d8ba-45b3-96de-efe388b9fc35" containerID="ba9ef4df2ae722ddcbcf9ebb778a911849eb3f6cda19b59d3fed87a61e483993" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.012755 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f2e8-account-create-update-tnvvs" event={"ID":"141de14b-d8ba-45b3-96de-efe388b9fc35","Type":"ContainerDied","Data":"ba9ef4df2ae722ddcbcf9ebb778a911849eb3f6cda19b59d3fed87a61e483993"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.014564 4914 generic.go:334] "Generic (PLEG): container finished" podID="fb47ba3a-cc1d-48fb-9974-0d30deca719b" containerID="6cf5a0d9b79786c8302d1183ee5ea12812d37a8aff3ec2c26ba49deebb94df85" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.014606 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8w2bj" event={"ID":"fb47ba3a-cc1d-48fb-9974-0d30deca719b","Type":"ContainerDied","Data":"6cf5a0d9b79786c8302d1183ee5ea12812d37a8aff3ec2c26ba49deebb94df85"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.018943 4914 generic.go:334] "Generic (PLEG): container finished" podID="562c29cc-4430-4d60-9577-92ff7849353f" containerID="d8f73b59ed4e69f6877673635767b5659f77808db83cc3e43a655b8df8e3956c" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.018999 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-4158-account-create-update-hwrhs" event={"ID":"562c29cc-4430-4d60-9577-92ff7849353f","Type":"ContainerDied","Data":"d8f73b59ed4e69f6877673635767b5659f77808db83cc3e43a655b8df8e3956c"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.026347 4914 generic.go:334] "Generic (PLEG): container finished" 
podID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerID="45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.026517 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" event={"ID":"73907d8c-bf5d-4654-9a14-ae335ab89b11","Type":"ContainerDied","Data":"45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.026598 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" event={"ID":"73907d8c-bf5d-4654-9a14-ae335ab89b11","Type":"ContainerStarted","Data":"aec72491671e70eecca16f79bfcdf79633def18a70ac53bddd61de1a3681d5e5"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.050127 4914 generic.go:334] "Generic (PLEG): container finished" podID="83c5fce8-6d2b-47ae-ba0d-fbbc545de111" containerID="0d5b5ff41743d65278e9fe8ff021a803ff8b8bbd1562fdbff225c5269d778b54" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.050232 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g6c92" event={"ID":"83c5fce8-6d2b-47ae-ba0d-fbbc545de111","Type":"ContainerDied","Data":"0d5b5ff41743d65278e9fe8ff021a803ff8b8bbd1562fdbff225c5269d778b54"} Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.092184 4914 generic.go:334] "Generic (PLEG): container finished" podID="a9cc7a16-5905-4608-9212-2440d7235a11" containerID="f2a09f8166e99d336494e7f0904a70e077388b90920bb1b1fe93e52739dd082c" exitCode=0 Jan 30 21:33:32 crc kubenswrapper[4914]: I0130 21:33:32.092607 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m7dsr" event={"ID":"a9cc7a16-5905-4608-9212-2440d7235a11","Type":"ContainerDied","Data":"f2a09f8166e99d336494e7f0904a70e077388b90920bb1b1fe93e52739dd082c"} Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.007072 4914 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cinder-665f-account-create-update-bgn68" Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.054231 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q6dg\" (UniqueName: \"kubernetes.io/projected/001ceb92-59d0-496f-8331-51d4e131b419-kube-api-access-6q6dg\") pod \"001ceb92-59d0-496f-8331-51d4e131b419\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.054437 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ceb92-59d0-496f-8331-51d4e131b419-operator-scripts\") pod \"001ceb92-59d0-496f-8331-51d4e131b419\" (UID: \"001ceb92-59d0-496f-8331-51d4e131b419\") " Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.057084 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001ceb92-59d0-496f-8331-51d4e131b419-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "001ceb92-59d0-496f-8331-51d4e131b419" (UID: "001ceb92-59d0-496f-8331-51d4e131b419"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.072519 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001ceb92-59d0-496f-8331-51d4e131b419-kube-api-access-6q6dg" (OuterVolumeSpecName: "kube-api-access-6q6dg") pod "001ceb92-59d0-496f-8331-51d4e131b419" (UID: "001ceb92-59d0-496f-8331-51d4e131b419"). InnerVolumeSpecName "kube-api-access-6q6dg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.114485 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-665f-account-create-update-bgn68" event={"ID":"001ceb92-59d0-496f-8331-51d4e131b419","Type":"ContainerDied","Data":"2aea15859a3591654b5c5e19d74636c25e0a835d6c68a7cd8366f0c44c49baa9"} Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.114517 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aea15859a3591654b5c5e19d74636c25e0a835d6c68a7cd8366f0c44c49baa9" Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.114569 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-665f-account-create-update-bgn68" Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.123291 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" event={"ID":"73907d8c-bf5d-4654-9a14-ae335ab89b11","Type":"ContainerStarted","Data":"bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090"} Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.156361 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ceb92-59d0-496f-8331-51d4e131b419-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:33 crc kubenswrapper[4914]: I0130 21:33:33.156403 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q6dg\" (UniqueName: \"kubernetes.io/projected/001ceb92-59d0-496f-8331-51d4e131b419-kube-api-access-6q6dg\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.223678 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tz86b" Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.236287 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3497-account-create-update-58wqv" Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.368634 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsj5b\" (UniqueName: \"kubernetes.io/projected/580c0e5e-259e-4164-ac77-ca625b915ffa-kube-api-access-bsj5b\") pod \"580c0e5e-259e-4164-ac77-ca625b915ffa\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.368737 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/580c0e5e-259e-4164-ac77-ca625b915ffa-operator-scripts\") pod \"580c0e5e-259e-4164-ac77-ca625b915ffa\" (UID: \"580c0e5e-259e-4164-ac77-ca625b915ffa\") " Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.368803 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451aee5-930f-41e5-8a8c-8b50c3b3a887-operator-scripts\") pod \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.368891 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtt8p\" (UniqueName: \"kubernetes.io/projected/6451aee5-930f-41e5-8a8c-8b50c3b3a887-kube-api-access-gtt8p\") pod \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\" (UID: \"6451aee5-930f-41e5-8a8c-8b50c3b3a887\") " Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.369289 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580c0e5e-259e-4164-ac77-ca625b915ffa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "580c0e5e-259e-4164-ac77-ca625b915ffa" (UID: "580c0e5e-259e-4164-ac77-ca625b915ffa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.369739 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6451aee5-930f-41e5-8a8c-8b50c3b3a887-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6451aee5-930f-41e5-8a8c-8b50c3b3a887" (UID: "6451aee5-930f-41e5-8a8c-8b50c3b3a887"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.373717 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580c0e5e-259e-4164-ac77-ca625b915ffa-kube-api-access-bsj5b" (OuterVolumeSpecName: "kube-api-access-bsj5b") pod "580c0e5e-259e-4164-ac77-ca625b915ffa" (UID: "580c0e5e-259e-4164-ac77-ca625b915ffa"). InnerVolumeSpecName "kube-api-access-bsj5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.374437 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6451aee5-930f-41e5-8a8c-8b50c3b3a887-kube-api-access-gtt8p" (OuterVolumeSpecName: "kube-api-access-gtt8p") pod "6451aee5-930f-41e5-8a8c-8b50c3b3a887" (UID: "6451aee5-930f-41e5-8a8c-8b50c3b3a887"). InnerVolumeSpecName "kube-api-access-gtt8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.447771 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.471084 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/580c0e5e-259e-4164-ac77-ca625b915ffa-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.471366 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6451aee5-930f-41e5-8a8c-8b50c3b3a887-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.471389 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtt8p\" (UniqueName: \"kubernetes.io/projected/6451aee5-930f-41e5-8a8c-8b50c3b3a887-kube-api-access-gtt8p\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.471408 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsj5b\" (UniqueName: \"kubernetes.io/projected/580c0e5e-259e-4164-ac77-ca625b915ffa-kube-api-access-bsj5b\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.572952 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562c29cc-4430-4d60-9577-92ff7849353f-operator-scripts\") pod \"562c29cc-4430-4d60-9577-92ff7849353f\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") "
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.573108 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5jqv\" (UniqueName: \"kubernetes.io/projected/562c29cc-4430-4d60-9577-92ff7849353f-kube-api-access-w5jqv\") pod \"562c29cc-4430-4d60-9577-92ff7849353f\" (UID: \"562c29cc-4430-4d60-9577-92ff7849353f\") "
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.573406 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/562c29cc-4430-4d60-9577-92ff7849353f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "562c29cc-4430-4d60-9577-92ff7849353f" (UID: "562c29cc-4430-4d60-9577-92ff7849353f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.573904 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562c29cc-4430-4d60-9577-92ff7849353f-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.576415 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562c29cc-4430-4d60-9577-92ff7849353f-kube-api-access-w5jqv" (OuterVolumeSpecName: "kube-api-access-w5jqv") pod "562c29cc-4430-4d60-9577-92ff7849353f" (UID: "562c29cc-4430-4d60-9577-92ff7849353f"). InnerVolumeSpecName "kube-api-access-w5jqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:33.676606 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5jqv\" (UniqueName: \"kubernetes.io/projected/562c29cc-4430-4d60-9577-92ff7849353f-kube-api-access-w5jqv\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.141270 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3497-account-create-update-58wqv" event={"ID":"580c0e5e-259e-4164-ac77-ca625b915ffa","Type":"ContainerDied","Data":"6fd23d87ef4d7cfe792b20bfd5b33b4a5e38a71db19823109a9aab3663025107"}
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.141304 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd23d87ef4d7cfe792b20bfd5b33b4a5e38a71db19823109a9aab3663025107"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.141359 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3497-account-create-update-58wqv"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.143653 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tz86b" event={"ID":"6451aee5-930f-41e5-8a8c-8b50c3b3a887","Type":"ContainerDied","Data":"de526fbb49d8de0c3f022f984ad2c24bb0479a806df4e12f22934a95c80c80d2"}
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.143670 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de526fbb49d8de0c3f022f984ad2c24bb0479a806df4e12f22934a95c80c80d2"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.143699 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tz86b"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.169930 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-4158-account-create-update-hwrhs" event={"ID":"562c29cc-4430-4d60-9577-92ff7849353f","Type":"ContainerDied","Data":"d2fccdfde5581479729e92870d2486c59e55f87998a66180e409bbbb6830efff"}
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.169976 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-4158-account-create-update-hwrhs"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.170002 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2fccdfde5581479729e92870d2486c59e55f87998a66180e409bbbb6830efff"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.171052 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-4dddm"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:34.269422 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" podStartSLOduration=4.26935933 podStartE2EDuration="4.26935933s" podCreationTimestamp="2026-01-30 21:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:34.190930762 +0000 UTC m=+1147.629567533" watchObservedRunningTime="2026-01-30 21:33:34.26935933 +0000 UTC m=+1147.707996131"
Jan 30 21:33:36 crc kubenswrapper[4914]: I0130 21:33:35.184139 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6019a332-1bf4-40c9-9ed7-6956d8532e9c","Type":"ContainerStarted","Data":"b2b10f96820c441b177723557ab0da448028a6aece61825fe8cecce5b0995bee"}
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.629645 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.635745 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.644991 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.699340 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.776655 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc7a16-5905-4608-9212-2440d7235a11-operator-scripts\") pod \"a9cc7a16-5905-4608-9212-2440d7235a11\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.776756 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhmq\" (UniqueName: \"kubernetes.io/projected/fb47ba3a-cc1d-48fb-9974-0d30deca719b-kube-api-access-xdhmq\") pod \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.776837 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-operator-scripts\") pod \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.776861 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/141de14b-d8ba-45b3-96de-efe388b9fc35-operator-scripts\") pod \"141de14b-d8ba-45b3-96de-efe388b9fc35\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.776944 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb47ba3a-cc1d-48fb-9974-0d30deca719b-operator-scripts\") pod \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\" (UID: \"fb47ba3a-cc1d-48fb-9974-0d30deca719b\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.776970 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mp4c\" (UniqueName: \"kubernetes.io/projected/a9cc7a16-5905-4608-9212-2440d7235a11-kube-api-access-9mp4c\") pod \"a9cc7a16-5905-4608-9212-2440d7235a11\" (UID: \"a9cc7a16-5905-4608-9212-2440d7235a11\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777011 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp497\" (UniqueName: \"kubernetes.io/projected/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-kube-api-access-rp497\") pod \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\" (UID: \"83c5fce8-6d2b-47ae-ba0d-fbbc545de111\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777124 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h777z\" (UniqueName: \"kubernetes.io/projected/141de14b-d8ba-45b3-96de-efe388b9fc35-kube-api-access-h777z\") pod \"141de14b-d8ba-45b3-96de-efe388b9fc35\" (UID: \"141de14b-d8ba-45b3-96de-efe388b9fc35\") "
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777290 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83c5fce8-6d2b-47ae-ba0d-fbbc545de111" (UID: "83c5fce8-6d2b-47ae-ba0d-fbbc545de111"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777567 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb47ba3a-cc1d-48fb-9974-0d30deca719b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb47ba3a-cc1d-48fb-9974-0d30deca719b" (UID: "fb47ba3a-cc1d-48fb-9974-0d30deca719b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777615 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777678 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9cc7a16-5905-4608-9212-2440d7235a11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9cc7a16-5905-4608-9212-2440d7235a11" (UID: "a9cc7a16-5905-4608-9212-2440d7235a11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.777818 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/141de14b-d8ba-45b3-96de-efe388b9fc35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "141de14b-d8ba-45b3-96de-efe388b9fc35" (UID: "141de14b-d8ba-45b3-96de-efe388b9fc35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.786499 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cc7a16-5905-4608-9212-2440d7235a11-kube-api-access-9mp4c" (OuterVolumeSpecName: "kube-api-access-9mp4c") pod "a9cc7a16-5905-4608-9212-2440d7235a11" (UID: "a9cc7a16-5905-4608-9212-2440d7235a11"). InnerVolumeSpecName "kube-api-access-9mp4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.786553 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb47ba3a-cc1d-48fb-9974-0d30deca719b-kube-api-access-xdhmq" (OuterVolumeSpecName: "kube-api-access-xdhmq") pod "fb47ba3a-cc1d-48fb-9974-0d30deca719b" (UID: "fb47ba3a-cc1d-48fb-9974-0d30deca719b"). InnerVolumeSpecName "kube-api-access-xdhmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.786582 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-kube-api-access-rp497" (OuterVolumeSpecName: "kube-api-access-rp497") pod "83c5fce8-6d2b-47ae-ba0d-fbbc545de111" (UID: "83c5fce8-6d2b-47ae-ba0d-fbbc545de111"). InnerVolumeSpecName "kube-api-access-rp497". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.800668 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/141de14b-d8ba-45b3-96de-efe388b9fc35-kube-api-access-h777z" (OuterVolumeSpecName: "kube-api-access-h777z") pod "141de14b-d8ba-45b3-96de-efe388b9fc35" (UID: "141de14b-d8ba-45b3-96de-efe388b9fc35"). InnerVolumeSpecName "kube-api-access-h777z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879178 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb47ba3a-cc1d-48fb-9974-0d30deca719b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879502 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mp4c\" (UniqueName: \"kubernetes.io/projected/a9cc7a16-5905-4608-9212-2440d7235a11-kube-api-access-9mp4c\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879562 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp497\" (UniqueName: \"kubernetes.io/projected/83c5fce8-6d2b-47ae-ba0d-fbbc545de111-kube-api-access-rp497\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879611 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h777z\" (UniqueName: \"kubernetes.io/projected/141de14b-d8ba-45b3-96de-efe388b9fc35-kube-api-access-h777z\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879656 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cc7a16-5905-4608-9212-2440d7235a11-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879717 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhmq\" (UniqueName: \"kubernetes.io/projected/fb47ba3a-cc1d-48fb-9974-0d30deca719b-kube-api-access-xdhmq\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:37 crc kubenswrapper[4914]: I0130 21:33:37.879773 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/141de14b-d8ba-45b3-96de-efe388b9fc35-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.217031 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g6c92" event={"ID":"83c5fce8-6d2b-47ae-ba0d-fbbc545de111","Type":"ContainerDied","Data":"a7a894b4b739eb885ba8d113d9f21e7b8b6c5254d55f617dd55f37e05756bb3f"}
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.217077 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7a894b4b739eb885ba8d113d9f21e7b8b6c5254d55f617dd55f37e05756bb3f"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.217339 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g6c92"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.219892 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-m7dsr"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.219886 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-m7dsr" event={"ID":"a9cc7a16-5905-4608-9212-2440d7235a11","Type":"ContainerDied","Data":"5fa14da00f8a827b1ea0164b481251fc5418252c94ac43c83826c557c0672755"}
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.220557 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa14da00f8a827b1ea0164b481251fc5418252c94ac43c83826c557c0672755"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.227895 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6019a332-1bf4-40c9-9ed7-6956d8532e9c","Type":"ContainerStarted","Data":"4aa17bb4b92882facc16aef8fcf9464f430c6576500780c94e87b5b333b00974"}
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.230139 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f2e8-account-create-update-tnvvs" event={"ID":"141de14b-d8ba-45b3-96de-efe388b9fc35","Type":"ContainerDied","Data":"dd4b9d2b9577812b2910e57718098439fbc9542a8151c1f3f3c60331d032ada7"}
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.230174 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd4b9d2b9577812b2910e57718098439fbc9542a8151c1f3f3c60331d032ada7"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.230323 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f2e8-account-create-update-tnvvs"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.232450 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8w2bj" event={"ID":"fb47ba3a-cc1d-48fb-9974-0d30deca719b","Type":"ContainerDied","Data":"3940193fff5cca0c1b5cd15aae9caab37489b253e40e6d1437d45fbbc041ef61"}
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.232491 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3940193fff5cca0c1b5cd15aae9caab37489b253e40e6d1437d45fbbc041ef61"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.232581 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8w2bj"
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.236177 4914 generic.go:334] "Generic (PLEG): container finished" podID="cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" containerID="11c7a966dafdee6a0789c632fb8ad3b29eeadbcb5982dcdbe340ca69caf078a4" exitCode=0
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.236385 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lx7cf" event={"ID":"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25","Type":"ContainerDied","Data":"11c7a966dafdee6a0789c632fb8ad3b29eeadbcb5982dcdbe340ca69caf078a4"}
Jan 30 21:33:38 crc kubenswrapper[4914]: I0130 21:33:38.291784 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.29175651 podStartE2EDuration="17.29175651s" podCreationTimestamp="2026-01-30 21:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:38.276527644 +0000 UTC m=+1151.715164445" watchObservedRunningTime="2026-01-30 21:33:38.29175651 +0000 UTC m=+1151.730393321"
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.246382 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9mzs" event={"ID":"66ea5989-2da6-4da0-adb5-91da8e9e2779","Type":"ContainerStarted","Data":"3b037bd88c1966f96490390d255c0435e12d2127123da6b735bc98001463fc16"}
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.272567 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-k9mzs" podStartSLOduration=2.809956654 podStartE2EDuration="11.272549651s" podCreationTimestamp="2026-01-30 21:33:28 +0000 UTC" firstStartedPulling="2026-01-30 21:33:29.654287313 +0000 UTC m=+1143.092924074" lastFinishedPulling="2026-01-30 21:33:38.11688027 +0000 UTC m=+1151.555517071" observedRunningTime="2026-01-30 21:33:39.268070113 +0000 UTC m=+1152.706706884" watchObservedRunningTime="2026-01-30 21:33:39.272549651 +0000 UTC m=+1152.711186412"
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.783756 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lx7cf"
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.922219 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-db-sync-config-data\") pod \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") "
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.922339 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csrm7\" (UniqueName: \"kubernetes.io/projected/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-kube-api-access-csrm7\") pod \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") "
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.922430 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-combined-ca-bundle\") pod \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") "
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.922669 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-config-data\") pod \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\" (UID: \"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25\") "
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.929928 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-kube-api-access-csrm7" (OuterVolumeSpecName: "kube-api-access-csrm7") pod "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" (UID: "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25"). InnerVolumeSpecName "kube-api-access-csrm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.931193 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" (UID: "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.970078 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" (UID: "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:33:39 crc kubenswrapper[4914]: I0130 21:33:39.995396 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-config-data" (OuterVolumeSpecName: "config-data") pod "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" (UID: "cf598ffa-7a5d-4a7b-a547-cbf01cdefc25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.031986 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.032024 4914 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.032038 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrm7\" (UniqueName: \"kubernetes.io/projected/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-kube-api-access-csrm7\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.032052 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.255822 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lx7cf" event={"ID":"cf598ffa-7a5d-4a7b-a547-cbf01cdefc25","Type":"ContainerDied","Data":"c050940a536bade19febd4bc1957897cafb8c3ebf0ae0d95f305456518d908d5"}
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.255868 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c050940a536bade19febd4bc1957897cafb8c3ebf0ae0d95f305456518d908d5"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.255836 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lx7cf"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.687899 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-4dddm"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.719949 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4dddm"]
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.751629 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2bfvl"]
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753851 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6451aee5-930f-41e5-8a8c-8b50c3b3a887" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753874 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6451aee5-930f-41e5-8a8c-8b50c3b3a887" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753897 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cc7a16-5905-4608-9212-2440d7235a11" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753903 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cc7a16-5905-4608-9212-2440d7235a11" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753914 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c5fce8-6d2b-47ae-ba0d-fbbc545de111" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753921 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c5fce8-6d2b-47ae-ba0d-fbbc545de111" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753934 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141de14b-d8ba-45b3-96de-efe388b9fc35" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753940 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="141de14b-d8ba-45b3-96de-efe388b9fc35" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753951 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001ceb92-59d0-496f-8331-51d4e131b419" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753965 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="001ceb92-59d0-496f-8331-51d4e131b419" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753976 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb47ba3a-cc1d-48fb-9974-0d30deca719b" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753983 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb47ba3a-cc1d-48fb-9974-0d30deca719b" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.753993 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580c0e5e-259e-4164-ac77-ca625b915ffa" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.753999 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="580c0e5e-259e-4164-ac77-ca625b915ffa" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.754009 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562c29cc-4430-4d60-9577-92ff7849353f" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754014 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="562c29cc-4430-4d60-9577-92ff7849353f" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: E0130 21:33:40.754022 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" containerName="glance-db-sync"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754029 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" containerName="glance-db-sync"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754185 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6451aee5-930f-41e5-8a8c-8b50c3b3a887" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754194 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="141de14b-d8ba-45b3-96de-efe388b9fc35" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754202 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9cc7a16-5905-4608-9212-2440d7235a11" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754212 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c5fce8-6d2b-47ae-ba0d-fbbc545de111" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754222 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb47ba3a-cc1d-48fb-9974-0d30deca719b" containerName="mariadb-database-create"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754229 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="580c0e5e-259e-4164-ac77-ca625b915ffa" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754237 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="001ceb92-59d0-496f-8331-51d4e131b419" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754248 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" containerName="glance-db-sync"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.754262 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="562c29cc-4430-4d60-9577-92ff7849353f" containerName="mariadb-account-create-update"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.755192 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.779139 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2bfvl"]
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.863449 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.863832 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-config\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.863989 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.864151 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.864293 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgrg\" (UniqueName: \"kubernetes.io/projected/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-kube-api-access-zkgrg\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.864652 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.967385 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.967503 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.967552 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-config\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.967578 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.967640 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.967696 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkgrg\" (UniqueName: \"kubernetes.io/projected/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-kube-api-access-zkgrg\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.968266 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl"
Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.968827 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-config\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.968888 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.969084 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.969408 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:40 crc kubenswrapper[4914]: I0130 21:33:40.993591 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkgrg\" (UniqueName: \"kubernetes.io/projected/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-kube-api-access-zkgrg\") pod \"dnsmasq-dns-74f6bcbc87-2bfvl\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.077140 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.264261 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerName="dnsmasq-dns" containerID="cri-o://bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090" gracePeriod=10 Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.575436 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2bfvl"] Jan 30 21:33:41 crc kubenswrapper[4914]: W0130 21:33:41.596920 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca156aa5_6d21_4bae_9a50_7df08a4ee1fd.slice/crio-2cdb37fd8dab2b3966aa233fccd4166d8ebcb41df469866058c7445ea5f3cc92 WatchSource:0}: Error finding container 2cdb37fd8dab2b3966aa233fccd4166d8ebcb41df469866058c7445ea5f3cc92: Status 404 returned error can't find the container with id 2cdb37fd8dab2b3966aa233fccd4166d8ebcb41df469866058c7445ea5f3cc92 Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.683177 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.780925 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-svc\") pod \"73907d8c-bf5d-4654-9a14-ae335ab89b11\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.780976 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-config\") pod \"73907d8c-bf5d-4654-9a14-ae335ab89b11\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.781028 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-swift-storage-0\") pod \"73907d8c-bf5d-4654-9a14-ae335ab89b11\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.781215 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-nb\") pod \"73907d8c-bf5d-4654-9a14-ae335ab89b11\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.781265 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-sb\") pod \"73907d8c-bf5d-4654-9a14-ae335ab89b11\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.781494 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgcfm\" 
(UniqueName: \"kubernetes.io/projected/73907d8c-bf5d-4654-9a14-ae335ab89b11-kube-api-access-tgcfm\") pod \"73907d8c-bf5d-4654-9a14-ae335ab89b11\" (UID: \"73907d8c-bf5d-4654-9a14-ae335ab89b11\") " Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.787799 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73907d8c-bf5d-4654-9a14-ae335ab89b11-kube-api-access-tgcfm" (OuterVolumeSpecName: "kube-api-access-tgcfm") pod "73907d8c-bf5d-4654-9a14-ae335ab89b11" (UID: "73907d8c-bf5d-4654-9a14-ae335ab89b11"). InnerVolumeSpecName "kube-api-access-tgcfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.833271 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73907d8c-bf5d-4654-9a14-ae335ab89b11" (UID: "73907d8c-bf5d-4654-9a14-ae335ab89b11"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.838996 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73907d8c-bf5d-4654-9a14-ae335ab89b11" (UID: "73907d8c-bf5d-4654-9a14-ae335ab89b11"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.842087 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73907d8c-bf5d-4654-9a14-ae335ab89b11" (UID: "73907d8c-bf5d-4654-9a14-ae335ab89b11"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.846797 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73907d8c-bf5d-4654-9a14-ae335ab89b11" (UID: "73907d8c-bf5d-4654-9a14-ae335ab89b11"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.853184 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-config" (OuterVolumeSpecName: "config") pod "73907d8c-bf5d-4654-9a14-ae335ab89b11" (UID: "73907d8c-bf5d-4654-9a14-ae335ab89b11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.884675 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.884729 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.884747 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgcfm\" (UniqueName: \"kubernetes.io/projected/73907d8c-bf5d-4654-9a14-ae335ab89b11-kube-api-access-tgcfm\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.884765 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 
21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.884776 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.884786 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73907d8c-bf5d-4654-9a14-ae335ab89b11-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:41 crc kubenswrapper[4914]: I0130 21:33:41.947266 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.274318 4914 generic.go:334] "Generic (PLEG): container finished" podID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerID="bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090" exitCode=0 Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.274385 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" event={"ID":"73907d8c-bf5d-4654-9a14-ae335ab89b11","Type":"ContainerDied","Data":"bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090"} Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.274416 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" event={"ID":"73907d8c-bf5d-4654-9a14-ae335ab89b11","Type":"ContainerDied","Data":"aec72491671e70eecca16f79bfcdf79633def18a70ac53bddd61de1a3681d5e5"} Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.274438 4914 scope.go:117] "RemoveContainer" containerID="bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.274578 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-4dddm" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.277396 4914 generic.go:334] "Generic (PLEG): container finished" podID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerID="32c04cef9dac91aa57e6fbe983e6787c29652d7bd3c2f31f5f5a2a465259fdf1" exitCode=0 Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.277444 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" event={"ID":"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd","Type":"ContainerDied","Data":"32c04cef9dac91aa57e6fbe983e6787c29652d7bd3c2f31f5f5a2a465259fdf1"} Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.277475 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" event={"ID":"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd","Type":"ContainerStarted","Data":"2cdb37fd8dab2b3966aa233fccd4166d8ebcb41df469866058c7445ea5f3cc92"} Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.324987 4914 scope.go:117] "RemoveContainer" containerID="45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.440254 4914 scope.go:117] "RemoveContainer" containerID="bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090" Jan 30 21:33:42 crc kubenswrapper[4914]: E0130 21:33:42.440806 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090\": container with ID starting with bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090 not found: ID does not exist" containerID="bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.440850 4914 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090"} err="failed to get container status \"bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090\": rpc error: code = NotFound desc = could not find container \"bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090\": container with ID starting with bc845b9fc95b7ebba1c5ebb7a8c988ae8c6086119916f7e132dd340cc1b9e090 not found: ID does not exist" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.440890 4914 scope.go:117] "RemoveContainer" containerID="45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325" Jan 30 21:33:42 crc kubenswrapper[4914]: E0130 21:33:42.441218 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325\": container with ID starting with 45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325 not found: ID does not exist" containerID="45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.441236 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325"} err="failed to get container status \"45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325\": rpc error: code = NotFound desc = could not find container \"45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325\": container with ID starting with 45d0e07816f271342dc79f5313c1813d1d1855c5e28c69c68fcc7e0baccf8325 not found: ID does not exist" Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.523767 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-4dddm"] Jan 30 21:33:42 crc kubenswrapper[4914]: I0130 21:33:42.531330 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-764c5664d7-4dddm"] Jan 30 21:33:43 crc kubenswrapper[4914]: I0130 21:33:43.320088 4914 generic.go:334] "Generic (PLEG): container finished" podID="66ea5989-2da6-4da0-adb5-91da8e9e2779" containerID="3b037bd88c1966f96490390d255c0435e12d2127123da6b735bc98001463fc16" exitCode=0 Jan 30 21:33:43 crc kubenswrapper[4914]: I0130 21:33:43.320400 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9mzs" event={"ID":"66ea5989-2da6-4da0-adb5-91da8e9e2779","Type":"ContainerDied","Data":"3b037bd88c1966f96490390d255c0435e12d2127123da6b735bc98001463fc16"} Jan 30 21:33:43 crc kubenswrapper[4914]: I0130 21:33:43.326191 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" event={"ID":"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd","Type":"ContainerStarted","Data":"722b6e0a09c5c92aeec61dd241bf9a6841a9d073a4757595813def54d8869400"} Jan 30 21:33:43 crc kubenswrapper[4914]: I0130 21:33:43.326350 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:43 crc kubenswrapper[4914]: I0130 21:33:43.363606 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" podStartSLOduration=3.363587374 podStartE2EDuration="3.363587374s" podCreationTimestamp="2026-01-30 21:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:43.36176724 +0000 UTC m=+1156.800404001" watchObservedRunningTime="2026-01-30 21:33:43.363587374 +0000 UTC m=+1156.802224135" Jan 30 21:33:43 crc kubenswrapper[4914]: I0130 21:33:43.830727 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" path="/var/lib/kubelet/pods/73907d8c-bf5d-4654-9a14-ae335ab89b11/volumes" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.736682 4914 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k9mzs" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.855254 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-combined-ca-bundle\") pod \"66ea5989-2da6-4da0-adb5-91da8e9e2779\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.855305 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-config-data\") pod \"66ea5989-2da6-4da0-adb5-91da8e9e2779\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.855476 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp5gl\" (UniqueName: \"kubernetes.io/projected/66ea5989-2da6-4da0-adb5-91da8e9e2779-kube-api-access-bp5gl\") pod \"66ea5989-2da6-4da0-adb5-91da8e9e2779\" (UID: \"66ea5989-2da6-4da0-adb5-91da8e9e2779\") " Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.869014 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ea5989-2da6-4da0-adb5-91da8e9e2779-kube-api-access-bp5gl" (OuterVolumeSpecName: "kube-api-access-bp5gl") pod "66ea5989-2da6-4da0-adb5-91da8e9e2779" (UID: "66ea5989-2da6-4da0-adb5-91da8e9e2779"). InnerVolumeSpecName "kube-api-access-bp5gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.901903 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66ea5989-2da6-4da0-adb5-91da8e9e2779" (UID: "66ea5989-2da6-4da0-adb5-91da8e9e2779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.926690 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-config-data" (OuterVolumeSpecName: "config-data") pod "66ea5989-2da6-4da0-adb5-91da8e9e2779" (UID: "66ea5989-2da6-4da0-adb5-91da8e9e2779"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.957499 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.957530 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66ea5989-2da6-4da0-adb5-91da8e9e2779-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:44 crc kubenswrapper[4914]: I0130 21:33:44.957539 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp5gl\" (UniqueName: \"kubernetes.io/projected/66ea5989-2da6-4da0-adb5-91da8e9e2779-kube-api-access-bp5gl\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.349944 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k9mzs" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.349929 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k9mzs" event={"ID":"66ea5989-2da6-4da0-adb5-91da8e9e2779","Type":"ContainerDied","Data":"ec33a78a8b840ec3cc1114dde571a43cd2c0fb3039486ba01e9c90d78fc4af74"} Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.350065 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec33a78a8b840ec3cc1114dde571a43cd2c0fb3039486ba01e9c90d78fc4af74" Jan 30 21:33:45 crc kubenswrapper[4914]: E0130 21:33:45.397482 4914 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ea5989_2da6_4da0_adb5_91da8e9e2779.slice/crio-ec33a78a8b840ec3cc1114dde571a43cd2c0fb3039486ba01e9c90d78fc4af74\": RecentStats: unable to find data in memory cache]" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.640635 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8crdx"] Jan 30 21:33:45 crc kubenswrapper[4914]: E0130 21:33:45.641302 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ea5989-2da6-4da0-adb5-91da8e9e2779" containerName="keystone-db-sync" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.641318 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ea5989-2da6-4da0-adb5-91da8e9e2779" containerName="keystone-db-sync" Jan 30 21:33:45 crc kubenswrapper[4914]: E0130 21:33:45.641335 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerName="init" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.641342 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerName="init" Jan 30 21:33:45 crc kubenswrapper[4914]: E0130 21:33:45.641360 4914 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerName="dnsmasq-dns" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.641366 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerName="dnsmasq-dns" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.641536 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ea5989-2da6-4da0-adb5-91da8e9e2779" containerName="keystone-db-sync" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.641547 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="73907d8c-bf5d-4654-9a14-ae335ab89b11" containerName="dnsmasq-dns" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.642334 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.645025 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.645291 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tzbth" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.645450 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.645562 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.646183 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.661034 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2bfvl"] Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.669034 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-scripts\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.669169 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-combined-ca-bundle\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.669212 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-config-data\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.669304 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-fernet-keys\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.669394 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-credential-keys\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.669521 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw6z8\" (UniqueName: \"kubernetes.io/projected/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-kube-api-access-zw6z8\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.674110 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8crdx"] Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.696987 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-hqnfc"] Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.698439 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.740830 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-hqnfc"] Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.771848 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw6z8\" (UniqueName: \"kubernetes.io/projected/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-kube-api-access-zw6z8\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.771908 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.771953 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.771997 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772025 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-config\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772080 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-scripts\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772179 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-combined-ca-bundle\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772223 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-config-data\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772264 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-fernet-keys\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772308 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-credential-keys\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772348 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.772395 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md5nj\" (UniqueName: \"kubernetes.io/projected/d06bdedd-3c56-4294-9871-68021a8504d0-kube-api-access-md5nj\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.792501 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-scripts\") 
pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.812532 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-fernet-keys\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.869528 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-combined-ca-bundle\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.877463 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-credential-keys\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.877520 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw6z8\" (UniqueName: \"kubernetes.io/projected/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-kube-api-access-zw6z8\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.878175 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-config-data\") pod \"keystone-bootstrap-8crdx\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " pod="openstack/keystone-bootstrap-8crdx" Jan 30 
21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.885508 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.885619 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md5nj\" (UniqueName: \"kubernetes.io/projected/d06bdedd-3c56-4294-9871-68021a8504d0-kube-api-access-md5nj\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.885717 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.885777 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.885813 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 
21:33:45.885837 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-config\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.888471 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-config\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.891576 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.892146 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.892699 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-svc\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.896497 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:45 crc kubenswrapper[4914]: I0130 21:33:45.970204 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.007563 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md5nj\" (UniqueName: \"kubernetes.io/projected/d06bdedd-3c56-4294-9871-68021a8504d0-kube-api-access-md5nj\") pod \"dnsmasq-dns-847c4cc679-hqnfc\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.022527 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6kskl"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.023896 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.029097 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.039277 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.039515 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wcgnb" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.039656 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.075388 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.077628 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.087870 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.091599 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.091884 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.108781 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6kskl"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.140544 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-46tqv"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.141790 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.144281 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.145168 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wkslc" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.145333 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.148402 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-t4kd2"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.149674 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.164156 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-tfj5s" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.164776 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.164875 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-46tqv"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.164960 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.170769 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.171934 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-t4kd2"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.187477 4914 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-hqnfc"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196486 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48gj\" (UniqueName: \"kubernetes.io/projected/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-kube-api-access-x48gj\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196508 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6c9nl"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196521 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196623 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-etc-machine-id\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196644 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-log-httpd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196679 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-db-sync-config-data\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196789 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-config-data\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196809 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-scripts\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196825 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-scripts\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196853 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-combined-ca-bundle\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196895 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-config-data\") pod \"cinder-db-sync-6kskl\" (UID: 
\"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196913 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-run-httpd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196944 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.196985 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jxxd\" (UniqueName: \"kubernetes.io/projected/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-kube-api-access-4jxxd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.197806 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.203645 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6c9nl"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.221533 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9b5p4" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.221872 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308655 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-combined-ca-bundle\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308764 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvqs\" (UniqueName: \"kubernetes.io/projected/f4afaeee-72ae-4c47-b842-d201151915c4-kube-api-access-vqvqs\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308809 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-config-data\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308841 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-run-httpd\") pod 
\"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308925 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308971 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-config-data\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.308997 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jxxd\" (UniqueName: \"kubernetes.io/projected/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-kube-api-access-4jxxd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309061 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-certs\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309094 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48gj\" (UniqueName: \"kubernetes.io/projected/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-kube-api-access-x48gj\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 
21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309133 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309155 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-etc-machine-id\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309180 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-db-sync-config-data\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309216 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-config\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309248 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-log-httpd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309301 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-combined-ca-bundle\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309356 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-scripts\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309378 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmt8w\" (UniqueName: \"kubernetes.io/projected/6748bae8-dcab-4fdb-ab49-b60893908a7f-kube-api-access-qmt8w\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309431 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-db-sync-config-data\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309492 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-config-data\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309521 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-scripts\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309541 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-scripts\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309579 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-combined-ca-bundle\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309609 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-combined-ca-bundle\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.309653 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knbtb\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-kube-api-access-knbtb\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.313299 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.313330 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-log-httpd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.314011 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-etc-machine-id\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.326424 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9zkpj"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.326759 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-scripts\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.327516 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.330687 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-combined-ca-bundle\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.333189 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.334179 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-scripts\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.334726 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-db-sync-config-data\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.347689 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.354815 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-config-data\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.358388 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-config-data\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.368634 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dkdht"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.370902 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.371395 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nhc5p" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.388623 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerName="dnsmasq-dns" containerID="cri-o://722b6e0a09c5c92aeec61dd241bf9a6841a9d073a4757595813def54d8869400" gracePeriod=10 Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411181 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-combined-ca-bundle\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411225 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-scripts\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411245 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmt8w\" (UniqueName: \"kubernetes.io/projected/6748bae8-dcab-4fdb-ab49-b60893908a7f-kube-api-access-qmt8w\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411289 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-combined-ca-bundle\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411315 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knbtb\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-kube-api-access-knbtb\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411346 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-combined-ca-bundle\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411370 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvqs\" (UniqueName: 
\"kubernetes.io/projected/f4afaeee-72ae-4c47-b842-d201151915c4-kube-api-access-vqvqs\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411647 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-config-data\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411683 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-certs\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411736 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-db-sync-config-data\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.411754 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-config\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.415981 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-config-data\") pod \"cloudkitty-db-sync-t4kd2\" (UID: 
\"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.430294 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-db-sync-config-data\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.442313 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-certs\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.445420 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-scripts\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.445924 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-combined-ca-bundle\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.446339 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-combined-ca-bundle\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.446387 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-config\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.446760 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-combined-ca-bundle\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.446913 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.447646 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.447774 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9zkpj"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.452096 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48gj\" (UniqueName: \"kubernetes.io/projected/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-kube-api-access-x48gj\") pod \"cinder-db-sync-6kskl\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.495998 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6kskl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.507372 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knbtb\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-kube-api-access-knbtb\") pod \"cloudkitty-db-sync-t4kd2\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.508326 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jxxd\" (UniqueName: \"kubernetes.io/projected/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-kube-api-access-4jxxd\") pod \"ceilometer-0\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.509238 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmt8w\" (UniqueName: \"kubernetes.io/projected/6748bae8-dcab-4fdb-ab49-b60893908a7f-kube-api-access-qmt8w\") pod \"neutron-db-sync-46tqv\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514407 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514465 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-combined-ca-bundle\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " 
pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514512 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-scripts\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514571 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514597 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchbv\" (UniqueName: \"kubernetes.io/projected/25c531db-a02c-477b-b968-2f086a8443e8-kube-api-access-jchbv\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514655 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514678 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cktjv\" (UniqueName: \"kubernetes.io/projected/da9750a0-8f17-4cf7-9935-5da9c43a9a48-kube-api-access-cktjv\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: 
\"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514737 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25c531db-a02c-477b-b968-2f086a8443e8-logs\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514760 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-config\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514784 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.514806 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-config-data\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.520521 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dkdht"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.524008 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.529069 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvqs\" (UniqueName: \"kubernetes.io/projected/f4afaeee-72ae-4c47-b842-d201151915c4-kube-api-access-vqvqs\") pod \"barbican-db-sync-6c9nl\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") " pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.573874 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-46tqv" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.599187 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-t4kd2" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617140 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617181 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchbv\" (UniqueName: \"kubernetes.io/projected/25c531db-a02c-477b-b968-2f086a8443e8-kube-api-access-jchbv\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617232 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 
21:33:46.617248 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cktjv\" (UniqueName: \"kubernetes.io/projected/da9750a0-8f17-4cf7-9935-5da9c43a9a48-kube-api-access-cktjv\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617282 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25c531db-a02c-477b-b968-2f086a8443e8-logs\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617301 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-config\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617316 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617336 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-config-data\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617404 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617421 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-combined-ca-bundle\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.617447 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-scripts\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.621925 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25c531db-a02c-477b-b968-2f086a8443e8-logs\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.622911 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.624032 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-svc\") pod 
\"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.634359 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.636580 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-config\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.638135 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.649612 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-scripts\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.650231 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchbv\" (UniqueName: \"kubernetes.io/projected/25c531db-a02c-477b-b968-2f086a8443e8-kube-api-access-jchbv\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " 
pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.652300 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-combined-ca-bundle\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.649183 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-config-data\") pod \"placement-db-sync-9zkpj\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.676198 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6c9nl" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.688969 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cktjv\" (UniqueName: \"kubernetes.io/projected/da9750a0-8f17-4cf7-9935-5da9c43a9a48-kube-api-access-cktjv\") pod \"dnsmasq-dns-785d8bcb8c-dkdht\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.709216 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zkpj" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.815548 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.823528 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.826096 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.830121 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.830270 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.830453 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2msrh" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.830486 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.830622 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.924017 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-logs\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.924312 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.924438 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.924456 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-scripts\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.924529 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j88rg\" (UniqueName: \"kubernetes.io/projected/1f9dd155-1062-4e85-97ec-405d37ba8d47-kube-api-access-j88rg\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.925401 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.925544 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.925889 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-config-data\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:46 crc kubenswrapper[4914]: I0130 21:33:46.963963 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8crdx"] Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.015777 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029002 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029054 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-scripts\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029135 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j88rg\" (UniqueName: \"kubernetes.io/projected/1f9dd155-1062-4e85-97ec-405d37ba8d47-kube-api-access-j88rg\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029170 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029246 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029286 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-config-data\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029308 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-logs\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.029327 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.041004 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.055277 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.055319 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b72a388fd97a6544e59fcf27848209fccbec8612f776f09e5be6189768241719/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.075296 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-scripts\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.076079 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-logs\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.076268 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.078942 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.083260 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.089403 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.096557 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-config-data\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.098425 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.098583 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.118804 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j88rg\" (UniqueName: \"kubernetes.io/projected/1f9dd155-1062-4e85-97ec-405d37ba8d47-kube-api-access-j88rg\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 
21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.215908 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.236830 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.236875 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.237010 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.237066 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.237107 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.237233 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.237815 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6z4\" (UniqueName: \"kubernetes.io/projected/cb34dafa-3ea1-4548-a9a1-0b496152504c-kube-api-access-vc6z4\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.237922 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339280 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339350 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6z4\" (UniqueName: \"kubernetes.io/projected/cb34dafa-3ea1-4548-a9a1-0b496152504c-kube-api-access-vc6z4\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339448 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339507 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339528 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339564 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339582 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.339602 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.342899 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.343227 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.347647 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.347695 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66c50567016faa78360e7f45b700987189e7fa2a9601532760fae56b995ba54f/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.349376 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.350057 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.361493 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.361988 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.365922 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6z4\" (UniqueName: \"kubernetes.io/projected/cb34dafa-3ea1-4548-a9a1-0b496152504c-kube-api-access-vc6z4\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.399325 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-hqnfc"] Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.428948 4914 generic.go:334] "Generic (PLEG): container finished" podID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerID="722b6e0a09c5c92aeec61dd241bf9a6841a9d073a4757595813def54d8869400" exitCode=0 Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.429021 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" event={"ID":"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd","Type":"ContainerDied","Data":"722b6e0a09c5c92aeec61dd241bf9a6841a9d073a4757595813def54d8869400"} Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.446173 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8crdx" event={"ID":"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47","Type":"ContainerStarted","Data":"f80df3240efe68ee921ef944a01dd5c5d68a4200f079a14cb6ee0a4ee18a8b6f"} Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.451085 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.460062 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.491780 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.510243 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.549080 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-config\") pod \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.549171 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkgrg\" (UniqueName: \"kubernetes.io/projected/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-kube-api-access-zkgrg\") pod \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.549208 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-svc\") pod \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.549268 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-swift-storage-0\") pod \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.549301 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-sb\") pod \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.549379 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-nb\") pod \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\" (UID: \"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd\") " Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.560938 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-kube-api-access-zkgrg" (OuterVolumeSpecName: "kube-api-access-zkgrg") pod "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" (UID: "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd"). InnerVolumeSpecName "kube-api-access-zkgrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.626728 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" (UID: "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.638225 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-config" (OuterVolumeSpecName: "config") pod "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" (UID: "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.644423 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" (UID: "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.655264 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.655288 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.655299 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkgrg\" (UniqueName: \"kubernetes.io/projected/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-kube-api-access-zkgrg\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.655309 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.683681 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" (UID: "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.684331 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" (UID: "ca156aa5-6d21-4bae-9a50-7df08a4ee1fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.756824 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:47 crc kubenswrapper[4914]: I0130 21:33:47.757039 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.162278 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6kskl"] Jan 30 21:33:48 crc kubenswrapper[4914]: W0130 21:33:48.181949 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6748bae8_dcab_4fdb_ab49_b60893908a7f.slice/crio-8b57b201de81879e978a54221bce597132b6e15ab5017570185d67e7f3f865d1 WatchSource:0}: Error finding container 8b57b201de81879e978a54221bce597132b6e15ab5017570185d67e7f3f865d1: Status 404 returned 
error can't find the container with id 8b57b201de81879e978a54221bce597132b6e15ab5017570185d67e7f3f865d1 Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.190277 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-46tqv"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.211673 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6c9nl"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.232404 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dkdht"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.253303 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9zkpj"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.328779 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-t4kd2"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.343566 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.476682 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.557183 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.566121 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" event={"ID":"da9750a0-8f17-4cf7-9935-5da9c43a9a48","Type":"ContainerStarted","Data":"9920a73b71161a0427fd4e58d516856f5d92d178f00d18b9da1cdfd44dc0de0b"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.569521 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6kskl" 
event={"ID":"b7fe1c6e-0858-479f-b365-081a1b8fcf2d","Type":"ContainerStarted","Data":"1ab7c073dc13824bb150dc0eeb0af2511ed4618d4a5cff972d84bd32055f5b4e"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.595452 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerStarted","Data":"d8b92462429d341fb7fc009c88578796c99adfb9b32537691c30289cca7cdab9"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.609864 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-t4kd2" event={"ID":"a0548f63-8249-4708-88d9-b3f663b28778","Type":"ContainerStarted","Data":"76ca29269c6439ffd5817368ade8500ca4d0ecf77684033d3cbbf8029b6ed3b4"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.621635 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-46tqv" event={"ID":"6748bae8-dcab-4fdb-ab49-b60893908a7f","Type":"ContainerStarted","Data":"8b57b201de81879e978a54221bce597132b6e15ab5017570185d67e7f3f865d1"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.623501 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8crdx" event={"ID":"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47","Type":"ContainerStarted","Data":"4f6a107943ba05d7abfd1d86db3620d357bbd123074c5780835a11f8d8596eee"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.660448 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" event={"ID":"d06bdedd-3c56-4294-9871-68021a8504d0","Type":"ContainerStarted","Data":"536221b485a54d7a959f35671e420005f5490db2fe790da3a22afefbdc3713b3"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.666610 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.682647 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-6c9nl" event={"ID":"f4afaeee-72ae-4c47-b842-d201151915c4","Type":"ContainerStarted","Data":"e2d5e5ab23149a97e4d792533a5b619b5a5dd3940bdc1a9bc9a4c9ee4a9b7e22"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.688807 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" event={"ID":"ca156aa5-6d21-4bae-9a50-7df08a4ee1fd","Type":"ContainerDied","Data":"2cdb37fd8dab2b3966aa233fccd4166d8ebcb41df469866058c7445ea5f3cc92"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.688873 4914 scope.go:117] "RemoveContainer" containerID="722b6e0a09c5c92aeec61dd241bf9a6841a9d073a4757595813def54d8869400" Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.689024 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2bfvl" Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.695441 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zkpj" event={"ID":"25c531db-a02c-477b-b968-2f086a8443e8","Type":"ContainerStarted","Data":"63f7c6465f10fc5d2fc21953c7c04443ef40c5f76492dbd0cb67764da0d99b38"} Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.697974 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8crdx" podStartSLOduration=3.697952207 podStartE2EDuration="3.697952207s" podCreationTimestamp="2026-01-30 21:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:48.650254548 +0000 UTC m=+1162.088891309" watchObservedRunningTime="2026-01-30 21:33:48.697952207 +0000 UTC m=+1162.136588968" Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.737419 4914 scope.go:117] "RemoveContainer" containerID="32c04cef9dac91aa57e6fbe983e6787c29652d7bd3c2f31f5f5a2a465259fdf1" Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.742236 4914 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2bfvl"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.771036 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2bfvl"] Jan 30 21:33:48 crc kubenswrapper[4914]: I0130 21:33:48.795811 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.713509 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1f9dd155-1062-4e85-97ec-405d37ba8d47","Type":"ContainerStarted","Data":"32c07f4cb361742c9766ea672893e386cd230f9653d3990f67421ae9a57fc428"} Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.717666 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-46tqv" event={"ID":"6748bae8-dcab-4fdb-ab49-b60893908a7f","Type":"ContainerStarted","Data":"7aa1e887ff1d5b5fa539da14db28a32b20602da026fc610316282f7c7f00dfdf"} Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.720231 4914 generic.go:334] "Generic (PLEG): container finished" podID="d06bdedd-3c56-4294-9871-68021a8504d0" containerID="64be5ce9292635356e30d42b98ac8f9dabaab33bb9d9744ef33f6f14b683f979" exitCode=0 Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.720284 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" event={"ID":"d06bdedd-3c56-4294-9871-68021a8504d0","Type":"ContainerDied","Data":"64be5ce9292635356e30d42b98ac8f9dabaab33bb9d9744ef33f6f14b683f979"} Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.721993 4914 generic.go:334] "Generic (PLEG): container finished" podID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerID="f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98" exitCode=0 Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.722103 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" event={"ID":"da9750a0-8f17-4cf7-9935-5da9c43a9a48","Type":"ContainerDied","Data":"f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98"} Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.734789 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-46tqv" podStartSLOduration=3.734763656 podStartE2EDuration="3.734763656s" podCreationTimestamp="2026-01-30 21:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:49.731941488 +0000 UTC m=+1163.170578259" watchObservedRunningTime="2026-01-30 21:33:49.734763656 +0000 UTC m=+1163.173400437" Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.877786 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" path="/var/lib/kubelet/pods/ca156aa5-6d21-4bae-9a50-7df08a4ee1fd/volumes" Jan 30 21:33:49 crc kubenswrapper[4914]: I0130 21:33:49.944603 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:33:50 crc kubenswrapper[4914]: W0130 21:33:50.014860 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb34dafa_3ea1_4548_a9a1_0b496152504c.slice/crio-9842c826513a9c340c91e29ac7cab11c6da8dff175dfe64c72815a3c93d8c754 WatchSource:0}: Error finding container 9842c826513a9c340c91e29ac7cab11c6da8dff175dfe64c72815a3c93d8c754: Status 404 returned error can't find the container with id 9842c826513a9c340c91e29ac7cab11c6da8dff175dfe64c72815a3c93d8c754 Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.321947 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.471147 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-svc\") pod \"d06bdedd-3c56-4294-9871-68021a8504d0\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.471239 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-swift-storage-0\") pod \"d06bdedd-3c56-4294-9871-68021a8504d0\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.471291 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-nb\") pod \"d06bdedd-3c56-4294-9871-68021a8504d0\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.471330 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-sb\") pod \"d06bdedd-3c56-4294-9871-68021a8504d0\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.471513 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md5nj\" (UniqueName: \"kubernetes.io/projected/d06bdedd-3c56-4294-9871-68021a8504d0-kube-api-access-md5nj\") pod \"d06bdedd-3c56-4294-9871-68021a8504d0\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.471541 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-config\") pod \"d06bdedd-3c56-4294-9871-68021a8504d0\" (UID: \"d06bdedd-3c56-4294-9871-68021a8504d0\") " Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.476732 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06bdedd-3c56-4294-9871-68021a8504d0-kube-api-access-md5nj" (OuterVolumeSpecName: "kube-api-access-md5nj") pod "d06bdedd-3c56-4294-9871-68021a8504d0" (UID: "d06bdedd-3c56-4294-9871-68021a8504d0"). InnerVolumeSpecName "kube-api-access-md5nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.495415 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d06bdedd-3c56-4294-9871-68021a8504d0" (UID: "d06bdedd-3c56-4294-9871-68021a8504d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.522888 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d06bdedd-3c56-4294-9871-68021a8504d0" (UID: "d06bdedd-3c56-4294-9871-68021a8504d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.530657 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-config" (OuterVolumeSpecName: "config") pod "d06bdedd-3c56-4294-9871-68021a8504d0" (UID: "d06bdedd-3c56-4294-9871-68021a8504d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.531365 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d06bdedd-3c56-4294-9871-68021a8504d0" (UID: "d06bdedd-3c56-4294-9871-68021a8504d0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.535531 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d06bdedd-3c56-4294-9871-68021a8504d0" (UID: "d06bdedd-3c56-4294-9871-68021a8504d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.575639 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.575669 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.575680 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.575689 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-ovsdbserver-sb\") on node 
\"crc\" DevicePath \"\"" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.575718 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md5nj\" (UniqueName: \"kubernetes.io/projected/d06bdedd-3c56-4294-9871-68021a8504d0-kube-api-access-md5nj\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.575727 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06bdedd-3c56-4294-9871-68021a8504d0-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.736077 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1f9dd155-1062-4e85-97ec-405d37ba8d47","Type":"ContainerStarted","Data":"32e67dbd6993ed9f230801ebd2a0ff305cbdc4f4b63dedf77f3cf98ff69f8415"} Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.740093 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb34dafa-3ea1-4548-a9a1-0b496152504c","Type":"ContainerStarted","Data":"9842c826513a9c340c91e29ac7cab11c6da8dff175dfe64c72815a3c93d8c754"} Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.743551 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.744890 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-hqnfc" event={"ID":"d06bdedd-3c56-4294-9871-68021a8504d0","Type":"ContainerDied","Data":"536221b485a54d7a959f35671e420005f5490db2fe790da3a22afefbdc3713b3"} Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.744924 4914 scope.go:117] "RemoveContainer" containerID="64be5ce9292635356e30d42b98ac8f9dabaab33bb9d9744ef33f6f14b683f979" Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.830158 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-hqnfc"] Jan 30 21:33:50 crc kubenswrapper[4914]: I0130 21:33:50.837347 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-hqnfc"] Jan 30 21:33:51 crc kubenswrapper[4914]: I0130 21:33:51.779340 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" event={"ID":"da9750a0-8f17-4cf7-9935-5da9c43a9a48","Type":"ContainerStarted","Data":"1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee"} Jan 30 21:33:51 crc kubenswrapper[4914]: I0130 21:33:51.831456 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06bdedd-3c56-4294-9871-68021a8504d0" path="/var/lib/kubelet/pods/d06bdedd-3c56-4294-9871-68021a8504d0/volumes" Jan 30 21:33:51 crc kubenswrapper[4914]: I0130 21:33:51.946898 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:51 crc kubenswrapper[4914]: I0130 21:33:51.953479 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.882050 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"1f9dd155-1062-4e85-97ec-405d37ba8d47","Type":"ContainerStarted","Data":"79811c46113ea0b2acc17b0ee28e138c8ea1a34dbced25d83dd189761c6b996c"} Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.882489 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-log" containerID="cri-o://32e67dbd6993ed9f230801ebd2a0ff305cbdc4f4b63dedf77f3cf98ff69f8415" gracePeriod=30 Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.883018 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-httpd" containerID="cri-o://79811c46113ea0b2acc17b0ee28e138c8ea1a34dbced25d83dd189761c6b996c" gracePeriod=30 Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.899944 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb34dafa-3ea1-4548-a9a1-0b496152504c","Type":"ContainerStarted","Data":"759fd4b17714ab8c8729cab1e9ce33c4d68a38b14fb06b2a88375d1a5cd9ec62"} Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.900423 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.909274 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.915805 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.915789812 podStartE2EDuration="7.915789812s" podCreationTimestamp="2026-01-30 21:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
21:33:52.909619073 +0000 UTC m=+1166.348255834" watchObservedRunningTime="2026-01-30 21:33:52.915789812 +0000 UTC m=+1166.354426573" Jan 30 21:33:52 crc kubenswrapper[4914]: I0130 21:33:52.990773 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" podStartSLOduration=6.990755787 podStartE2EDuration="6.990755787s" podCreationTimestamp="2026-01-30 21:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:52.988996664 +0000 UTC m=+1166.427633445" watchObservedRunningTime="2026-01-30 21:33:52.990755787 +0000 UTC m=+1166.429392548" Jan 30 21:33:53 crc kubenswrapper[4914]: I0130 21:33:53.915752 4914 generic.go:334] "Generic (PLEG): container finished" podID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerID="32e67dbd6993ed9f230801ebd2a0ff305cbdc4f4b63dedf77f3cf98ff69f8415" exitCode=143 Jan 30 21:33:53 crc kubenswrapper[4914]: I0130 21:33:53.916868 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1f9dd155-1062-4e85-97ec-405d37ba8d47","Type":"ContainerDied","Data":"32e67dbd6993ed9f230801ebd2a0ff305cbdc4f4b63dedf77f3cf98ff69f8415"} Jan 30 21:33:54 crc kubenswrapper[4914]: I0130 21:33:54.929416 4914 generic.go:334] "Generic (PLEG): container finished" podID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerID="79811c46113ea0b2acc17b0ee28e138c8ea1a34dbced25d83dd189761c6b996c" exitCode=0 Jan 30 21:33:54 crc kubenswrapper[4914]: I0130 21:33:54.929485 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1f9dd155-1062-4e85-97ec-405d37ba8d47","Type":"ContainerDied","Data":"79811c46113ea0b2acc17b0ee28e138c8ea1a34dbced25d83dd189761c6b996c"} Jan 30 21:33:54 crc kubenswrapper[4914]: I0130 21:33:54.932591 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"cb34dafa-3ea1-4548-a9a1-0b496152504c","Type":"ContainerStarted","Data":"a418116e174ca741d33d55cf48b3b244286ee248d5b324721610e85f6b017f53"} Jan 30 21:33:55 crc kubenswrapper[4914]: I0130 21:33:55.943990 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-log" containerID="cri-o://759fd4b17714ab8c8729cab1e9ce33c4d68a38b14fb06b2a88375d1a5cd9ec62" gracePeriod=30 Jan 30 21:33:55 crc kubenswrapper[4914]: I0130 21:33:55.944116 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-httpd" containerID="cri-o://a418116e174ca741d33d55cf48b3b244286ee248d5b324721610e85f6b017f53" gracePeriod=30 Jan 30 21:33:55 crc kubenswrapper[4914]: I0130 21:33:55.973547 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.97353293 podStartE2EDuration="10.97353293s" podCreationTimestamp="2026-01-30 21:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:33:55.967049864 +0000 UTC m=+1169.405686635" watchObservedRunningTime="2026-01-30 21:33:55.97353293 +0000 UTC m=+1169.412169691" Jan 30 21:33:56 crc kubenswrapper[4914]: I0130 21:33:56.817869 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:33:56 crc kubenswrapper[4914]: I0130 21:33:56.879456 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5g9tf"] Jan 30 21:33:56 crc kubenswrapper[4914]: I0130 21:33:56.879967 4914 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-698758b865-5g9tf" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="dnsmasq-dns" containerID="cri-o://58d2ae20f52406209286753e4fda81fb98788d6f3f90414533d61c2e6ecab008" gracePeriod=10 Jan 30 21:33:56 crc kubenswrapper[4914]: I0130 21:33:56.963218 4914 generic.go:334] "Generic (PLEG): container finished" podID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerID="759fd4b17714ab8c8729cab1e9ce33c4d68a38b14fb06b2a88375d1a5cd9ec62" exitCode=143 Jan 30 21:33:56 crc kubenswrapper[4914]: I0130 21:33:56.963262 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb34dafa-3ea1-4548-a9a1-0b496152504c","Type":"ContainerDied","Data":"759fd4b17714ab8c8729cab1e9ce33c4d68a38b14fb06b2a88375d1a5cd9ec62"} Jan 30 21:33:58 crc kubenswrapper[4914]: I0130 21:33:58.359971 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-5g9tf" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Jan 30 21:33:58 crc kubenswrapper[4914]: I0130 21:33:58.987194 4914 generic.go:334] "Generic (PLEG): container finished" podID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerID="a418116e174ca741d33d55cf48b3b244286ee248d5b324721610e85f6b017f53" exitCode=0 Jan 30 21:33:58 crc kubenswrapper[4914]: I0130 21:33:58.987270 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb34dafa-3ea1-4548-a9a1-0b496152504c","Type":"ContainerDied","Data":"a418116e174ca741d33d55cf48b3b244286ee248d5b324721610e85f6b017f53"} Jan 30 21:33:58 crc kubenswrapper[4914]: I0130 21:33:58.989407 4914 generic.go:334] "Generic (PLEG): container finished" podID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerID="58d2ae20f52406209286753e4fda81fb98788d6f3f90414533d61c2e6ecab008" exitCode=0 Jan 30 21:33:58 crc kubenswrapper[4914]: I0130 
21:33:58.989445 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5g9tf" event={"ID":"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe","Type":"ContainerDied","Data":"58d2ae20f52406209286753e4fda81fb98788d6f3f90414533d61c2e6ecab008"} Jan 30 21:34:02 crc kubenswrapper[4914]: I0130 21:34:02.022347 4914 generic.go:334] "Generic (PLEG): container finished" podID="0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" containerID="4f6a107943ba05d7abfd1d86db3620d357bbd123074c5780835a11f8d8596eee" exitCode=0 Jan 30 21:34:02 crc kubenswrapper[4914]: I0130 21:34:02.022844 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8crdx" event={"ID":"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47","Type":"ContainerDied","Data":"4f6a107943ba05d7abfd1d86db3620d357bbd123074c5780835a11f8d8596eee"} Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.536948 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.548661 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.553609 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.654880 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.654924 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-scripts\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655059 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-combined-ca-bundle\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655092 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw6z8\" (UniqueName: \"kubernetes.io/projected/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-kube-api-access-zw6z8\") pod \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655113 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-public-tls-certs\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655132 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-j88rg\" (UniqueName: \"kubernetes.io/projected/1f9dd155-1062-4e85-97ec-405d37ba8d47-kube-api-access-j88rg\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655154 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-config\") pod \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655207 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-config-data\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655233 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-logs\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655257 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-fernet-keys\") pod \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655300 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-scripts\") pod \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655362 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-combined-ca-bundle\") pod \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655377 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-httpd-run\") pod \"1f9dd155-1062-4e85-97ec-405d37ba8d47\" (UID: \"1f9dd155-1062-4e85-97ec-405d37ba8d47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655405 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-nb\") pod \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655453 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-sb\") pod \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655482 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlfcb\" (UniqueName: \"kubernetes.io/projected/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-kube-api-access-vlfcb\") pod \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655798 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-logs" (OuterVolumeSpecName: "logs") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: 
"1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655836 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-dns-svc\") pod \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\" (UID: \"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.655993 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-config-data\") pod \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.656055 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-credential-keys\") pod \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\" (UID: \"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47\") " Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.657165 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.669475 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.677209 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-kube-api-access-zw6z8" (OuterVolumeSpecName: "kube-api-access-zw6z8") pod "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" (UID: "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47"). InnerVolumeSpecName "kube-api-access-zw6z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.677551 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-scripts" (OuterVolumeSpecName: "scripts") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.677633 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9dd155-1062-4e85-97ec-405d37ba8d47-kube-api-access-j88rg" (OuterVolumeSpecName: "kube-api-access-j88rg") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "kube-api-access-j88rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.680649 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" (UID: "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.716880 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-scripts" (OuterVolumeSpecName: "scripts") pod "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" (UID: "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.726080 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-kube-api-access-vlfcb" (OuterVolumeSpecName: "kube-api-access-vlfcb") pod "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" (UID: "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe"). InnerVolumeSpecName "kube-api-access-vlfcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.726095 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" (UID: "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.739511 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-config-data" (OuterVolumeSpecName: "config-data") pod "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" (UID: "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763551 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw6z8\" (UniqueName: \"kubernetes.io/projected/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-kube-api-access-zw6z8\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763591 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j88rg\" (UniqueName: \"kubernetes.io/projected/1f9dd155-1062-4e85-97ec-405d37ba8d47-kube-api-access-j88rg\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763605 4914 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763617 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763629 4914 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1f9dd155-1062-4e85-97ec-405d37ba8d47-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763640 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlfcb\" (UniqueName: \"kubernetes.io/projected/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-kube-api-access-vlfcb\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763652 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763665 4914 
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.763676 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.770120 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" (UID: "0fbf2f8a-cd4a-4981-8e83-883fe1b43a47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.774187 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.789120 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23" (OuterVolumeSpecName: "glance") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "pvc-581e81f1-e0f1-495b-b12a-80feb1423c23". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.821504 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" (UID: "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.847124 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" (UID: "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.852932 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-config" (OuterVolumeSpecName: "config") pod "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" (UID: "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.859476 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865643 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865667 4914 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865678 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865686 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865696 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865716 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.865735 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") on node \"crc\" " Jan 30 21:34:07 crc 
kubenswrapper[4914]: I0130 21:34:07.882232 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" (UID: "9a72c047-4bec-41a0-bcd6-6002e9fb8dbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.888664 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.888833 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-581e81f1-e0f1-495b-b12a-80feb1423c23" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23") on node "crc" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.893286 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-config-data" (OuterVolumeSpecName: "config-data") pod "1f9dd155-1062-4e85-97ec-405d37ba8d47" (UID: "1f9dd155-1062-4e85-97ec-405d37ba8d47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.967910 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.967949 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:07 crc kubenswrapper[4914]: I0130 21:34:07.967963 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9dd155-1062-4e85-97ec-405d37ba8d47-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.079773 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1f9dd155-1062-4e85-97ec-405d37ba8d47","Type":"ContainerDied","Data":"32c07f4cb361742c9766ea672893e386cd230f9653d3990f67421ae9a57fc428"} Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.079794 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.079828 4914 scope.go:117] "RemoveContainer" containerID="79811c46113ea0b2acc17b0ee28e138c8ea1a34dbced25d83dd189761c6b996c" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.081634 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5g9tf" event={"ID":"9a72c047-4bec-41a0-bcd6-6002e9fb8dbe","Type":"ContainerDied","Data":"956e047291c1480d721f1809a6e625fa125c395fe2436f8b76d532410a234118"} Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.081728 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5g9tf" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.083789 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8crdx" event={"ID":"0fbf2f8a-cd4a-4981-8e83-883fe1b43a47","Type":"ContainerDied","Data":"f80df3240efe68ee921ef944a01dd5c5d68a4200f079a14cb6ee0a4ee18a8b6f"} Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.083830 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80df3240efe68ee921ef944a01dd5c5d68a4200f079a14cb6ee0a4ee18a8b6f" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.083847 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8crdx" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.121120 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.134080 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.147884 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5g9tf"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.161692 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5g9tf"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.172859 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173296 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerName="dnsmasq-dns" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173318 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerName="dnsmasq-dns" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173337 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173343 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173364 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" containerName="keystone-bootstrap" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173371 4914 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" containerName="keystone-bootstrap" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173388 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173394 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173405 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-httpd" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173411 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-httpd" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173422 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-log" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173427 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-log" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173434 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06bdedd-3c56-4294-9871-68021a8504d0" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173439 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06bdedd-3c56-4294-9871-68021a8504d0" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: E0130 21:34:08.173455 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="dnsmasq-dns" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173460 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="dnsmasq-dns" 
Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173650 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" containerName="keystone-bootstrap" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173671 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-httpd" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173682 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca156aa5-6d21-4bae-9a50-7df08a4ee1fd" containerName="dnsmasq-dns" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173692 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="dnsmasq-dns" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173723 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" containerName="glance-log" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.173731 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06bdedd-3c56-4294-9871-68021a8504d0" containerName="init" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.174750 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.182726 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.183012 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.187995 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275287 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-scripts\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275348 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275404 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275434 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275538 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275590 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-logs\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275616 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-config-data\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.275670 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwq27\" (UniqueName: \"kubernetes.io/projected/7939da09-12d4-4b76-9664-cd12cfd93f72-kube-api-access-qwq27\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.359930 4914 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-698758b865-5g9tf" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.376969 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377100 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377150 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-logs\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377184 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-config-data\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377231 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwq27\" (UniqueName: \"kubernetes.io/projected/7939da09-12d4-4b76-9664-cd12cfd93f72-kube-api-access-qwq27\") pod 
\"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377293 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-scripts\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377342 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377372 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377471 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.377484 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-logs\") pod \"glance-default-external-api-0\" (UID: 
\"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.379211 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.379239 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b72a388fd97a6544e59fcf27848209fccbec8612f776f09e5be6189768241719/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.381845 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.382140 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.388286 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-scripts\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " 
pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.388953 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-config-data\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.395582 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwq27\" (UniqueName: \"kubernetes.io/projected/7939da09-12d4-4b76-9664-cd12cfd93f72-kube-api-access-qwq27\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.434671 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.507219 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.683436 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8crdx"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.695091 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8crdx"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.812032 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5jxqn"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.813528 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5jxqn"] Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.813607 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.823524 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.823665 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tzbth" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.823739 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.824013 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.892883 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-scripts\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.892960 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-credential-keys\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.893166 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lbc\" (UniqueName: \"kubernetes.io/projected/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-kube-api-access-l8lbc\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.893218 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-fernet-keys\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.893253 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-combined-ca-bundle\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.893414 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-config-data\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.994794 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lbc\" (UniqueName: \"kubernetes.io/projected/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-kube-api-access-l8lbc\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.994850 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-fernet-keys\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.994878 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-combined-ca-bundle\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.994951 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-config-data\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.995614 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-scripts\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:08 crc kubenswrapper[4914]: I0130 21:34:08.995640 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-credential-keys\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.000426 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-fernet-keys\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.001093 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-combined-ca-bundle\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.001958 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-scripts\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.009646 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-config-data\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.011307 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-credential-keys\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " 
pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.011831 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lbc\" (UniqueName: \"kubernetes.io/projected/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-kube-api-access-l8lbc\") pod \"keystone-bootstrap-5jxqn\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.139579 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.844261 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbf2f8a-cd4a-4981-8e83-883fe1b43a47" path="/var/lib/kubelet/pods/0fbf2f8a-cd4a-4981-8e83-883fe1b43a47/volumes" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.846225 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9dd155-1062-4e85-97ec-405d37ba8d47" path="/var/lib/kubelet/pods/1f9dd155-1062-4e85-97ec-405d37ba8d47/volumes" Jan 30 21:34:09 crc kubenswrapper[4914]: I0130 21:34:09.848151 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a72c047-4bec-41a0-bcd6-6002e9fb8dbe" path="/var/lib/kubelet/pods/9a72c047-4bec-41a0-bcd6-6002e9fb8dbe/volumes" Jan 30 21:34:17 crc kubenswrapper[4914]: I0130 21:34:17.511336 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:17 crc kubenswrapper[4914]: I0130 21:34:17.512131 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.311965 4914 scope.go:117] "RemoveContainer" containerID="32e67dbd6993ed9f230801ebd2a0ff305cbdc4f4b63dedf77f3cf98ff69f8415" Jan 30 21:34:23 crc kubenswrapper[4914]: E0130 21:34:23.320681 4914 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 30 21:34:23 crc kubenswrapper[4914]: E0130 21:34:23.320873 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqvqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-db-sync-6c9nl_openstack(f4afaeee-72ae-4c47-b842-d201151915c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:23 crc kubenswrapper[4914]: E0130 21:34:23.322139 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-6c9nl" podUID="f4afaeee-72ae-4c47-b842-d201151915c4" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.426737 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595140 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-internal-tls-certs\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595251 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-combined-ca-bundle\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595298 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-httpd-run\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595404 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595437 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-scripts\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595460 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6z4\" (UniqueName: \"kubernetes.io/projected/cb34dafa-3ea1-4548-a9a1-0b496152504c-kube-api-access-vc6z4\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595482 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-config-data\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.595498 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-logs\") pod \"cb34dafa-3ea1-4548-a9a1-0b496152504c\" (UID: \"cb34dafa-3ea1-4548-a9a1-0b496152504c\") " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.596041 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-logs" (OuterVolumeSpecName: "logs") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.596160 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.600037 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-scripts" (OuterVolumeSpecName: "scripts") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.601747 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb34dafa-3ea1-4548-a9a1-0b496152504c-kube-api-access-vc6z4" (OuterVolumeSpecName: "kube-api-access-vc6z4") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "kube-api-access-vc6z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.617085 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1" (OuterVolumeSpecName: "glance") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.630391 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.646630 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.658851 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-config-data" (OuterVolumeSpecName: "config-data") pod "cb34dafa-3ea1-4548-a9a1-0b496152504c" (UID: "cb34dafa-3ea1-4548-a9a1-0b496152504c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697077 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") on node \"crc\" " Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697117 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697132 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6z4\" (UniqueName: \"kubernetes.io/projected/cb34dafa-3ea1-4548-a9a1-0b496152504c-kube-api-access-vc6z4\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697145 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697158 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697168 4914 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.697181 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb34dafa-3ea1-4548-a9a1-0b496152504c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: 
I0130 21:34:23.697191 4914 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb34dafa-3ea1-4548-a9a1-0b496152504c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.727125 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.727297 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1") on node "crc" Jan 30 21:34:23 crc kubenswrapper[4914]: I0130 21:34:23.799562 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:23 crc kubenswrapper[4914]: E0130 21:34:23.843162 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 30 21:34:23 crc kubenswrapper[4914]: E0130 21:34:23.843511 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67bh9hcch5b6h689h5fdh5b4h79h67bh566h5f6h595hfdh566hfch657h67fh96h4h55ch5c4h5bfh669h68fhcfhc6hd5h5b5h58dh66bh667h5b7q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jxxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(dc98d77b-bdf3-4a3b-bfad-95ef146a731e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.272817 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb34dafa-3ea1-4548-a9a1-0b496152504c","Type":"ContainerDied","Data":"9842c826513a9c340c91e29ac7cab11c6da8dff175dfe64c72815a3c93d8c754"} Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.272858 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: E0130 21:34:24.274828 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-6c9nl" podUID="f4afaeee-72ae-4c47-b842-d201151915c4" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.328914 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.346512 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.363251 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:34:24 crc kubenswrapper[4914]: E0130 21:34:24.363741 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-log" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.363762 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-log" Jan 30 21:34:24 crc kubenswrapper[4914]: E0130 21:34:24.363789 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-httpd" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.363798 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-httpd" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.364043 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-log" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.364071 
4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" containerName="glance-httpd" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.365297 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.367591 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.368430 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.373285 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.510883 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.510968 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.511001 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.511030 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.511058 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.511097 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwdc\" (UniqueName: \"kubernetes.io/projected/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-kube-api-access-jfwdc\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.511229 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.511298 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614181 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614288 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614497 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614598 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614688 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614818 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614895 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.614993 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwdc\" (UniqueName: \"kubernetes.io/projected/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-kube-api-access-jfwdc\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.616014 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.617100 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.619129 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.619173 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66c50567016faa78360e7f45b700987189e7fa2a9601532760fae56b995ba54f/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.621587 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.627724 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.629363 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.631546 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwdc\" (UniqueName: \"kubernetes.io/projected/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-kube-api-access-jfwdc\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.639205 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.657358 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:34:24 crc kubenswrapper[4914]: I0130 21:34:24.702268 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:25 crc kubenswrapper[4914]: E0130 21:34:25.045396 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 30 21:34:25 crc kubenswrapper[4914]: E0130 21:34:25.045600 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPa
th:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x48gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6kskl_openstack(b7fe1c6e-0858-479f-b365-081a1b8fcf2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:25 crc kubenswrapper[4914]: E0130 21:34:25.046809 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6kskl" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" Jan 30 21:34:25 crc kubenswrapper[4914]: E0130 21:34:25.300979 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6kskl" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" Jan 30 21:34:25 crc kubenswrapper[4914]: I0130 21:34:25.843985 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb34dafa-3ea1-4548-a9a1-0b496152504c" 
path="/var/lib/kubelet/pods/cb34dafa-3ea1-4548-a9a1-0b496152504c/volumes" Jan 30 21:34:26 crc kubenswrapper[4914]: I0130 21:34:26.983492 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:34:26 crc kubenswrapper[4914]: I0130 21:34:26.984169 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:34:30 crc kubenswrapper[4914]: I0130 21:34:30.288671 4914 scope.go:117] "RemoveContainer" containerID="58d2ae20f52406209286753e4fda81fb98788d6f3f90414533d61c2e6ecab008" Jan 30 21:34:30 crc kubenswrapper[4914]: E0130 21:34:30.872479 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Jan 30 21:34:30 crc kubenswrapper[4914]: E0130 21:34:30.872943 4914 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Jan 30 21:34:30 crc kubenswrapper[4914]: E0130 21:34:30.873161 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-knbtb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-t4kd2_openstack(a0548f63-8249-4708-88d9-b3f663b28778): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:30 crc kubenswrapper[4914]: E0130 21:34:30.874779 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-t4kd2" podUID="a0548f63-8249-4708-88d9-b3f663b28778" Jan 30 21:34:31 crc kubenswrapper[4914]: I0130 21:34:31.250841 4914 scope.go:117] "RemoveContainer" containerID="dfe8b844069b1515bd0bf43bcc4923434b172851daaaceab570e16acbaea9efd" Jan 30 21:34:31 crc kubenswrapper[4914]: I0130 21:34:31.270869 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5jxqn"] Jan 30 21:34:31 crc kubenswrapper[4914]: I0130 21:34:31.478895 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:34:31 crc kubenswrapper[4914]: E0130 21:34:31.524557 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-t4kd2" podUID="a0548f63-8249-4708-88d9-b3f663b28778" Jan 30 21:34:31 crc kubenswrapper[4914]: W0130 21:34:31.532520 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bb843c7_b3dd_494f_9eb2_ccfbf2c108c4.slice/crio-e4fe214505bd98ce3f94659c8b9e25076c29cab044960be5ee82e42e8b6ee0ce WatchSource:0}: Error 
finding container e4fe214505bd98ce3f94659c8b9e25076c29cab044960be5ee82e42e8b6ee0ce: Status 404 returned error can't find the container with id e4fe214505bd98ce3f94659c8b9e25076c29cab044960be5ee82e42e8b6ee0ce Jan 30 21:34:31 crc kubenswrapper[4914]: I0130 21:34:31.586464 4914 scope.go:117] "RemoveContainer" containerID="a418116e174ca741d33d55cf48b3b244286ee248d5b324721610e85f6b017f53" Jan 30 21:34:31 crc kubenswrapper[4914]: I0130 21:34:31.739279 4914 scope.go:117] "RemoveContainer" containerID="759fd4b17714ab8c8729cab1e9ce33c4d68a38b14fb06b2a88375d1a5cd9ec62" Jan 30 21:34:31 crc kubenswrapper[4914]: I0130 21:34:31.792056 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:34:31 crc kubenswrapper[4914]: W0130 21:34:31.806251 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cce3e04_8dbe_4df9_aed0_45303d35e7c4.slice/crio-20b47533104844fa5ef3d36cb485fdaf4f40a6e1de9efadd6f67542d5b4e13a8 WatchSource:0}: Error finding container 20b47533104844fa5ef3d36cb485fdaf4f40a6e1de9efadd6f67542d5b4e13a8: Status 404 returned error can't find the container with id 20b47533104844fa5ef3d36cb485fdaf4f40a6e1de9efadd6f67542d5b4e13a8 Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.380225 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cce3e04-8dbe-4df9-aed0-45303d35e7c4","Type":"ContainerStarted","Data":"20b47533104844fa5ef3d36cb485fdaf4f40a6e1de9efadd6f67542d5b4e13a8"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.388108 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zkpj" event={"ID":"25c531db-a02c-477b-b968-2f086a8443e8","Type":"ContainerStarted","Data":"9e75279ece50bf2e3425ec5f9fd516a2954e93c69c273e1bc87e22ed259d29ad"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.393391 4914 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-5jxqn" event={"ID":"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4","Type":"ContainerStarted","Data":"b554b1f4e32799e7f2ed76f98ec781a2f9ac8d9c8e7cf091df07db8332c1a303"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.393462 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5jxqn" event={"ID":"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4","Type":"ContainerStarted","Data":"e4fe214505bd98ce3f94659c8b9e25076c29cab044960be5ee82e42e8b6ee0ce"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.402310 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7939da09-12d4-4b76-9664-cd12cfd93f72","Type":"ContainerStarted","Data":"a7cc55c554c8dffb541010b33783266b1c4829c58b3912a008414f2dd3b2b5f7"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.402410 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7939da09-12d4-4b76-9664-cd12cfd93f72","Type":"ContainerStarted","Data":"744d0fe8ea320619e1016487d5236e70a95b7b1d950244af7eed866659e0ad61"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.415211 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9zkpj" podStartSLOduration=11.468038546 podStartE2EDuration="46.415189323s" podCreationTimestamp="2026-01-30 21:33:46 +0000 UTC" firstStartedPulling="2026-01-30 21:33:48.365253357 +0000 UTC m=+1161.803890118" lastFinishedPulling="2026-01-30 21:34:23.312404134 +0000 UTC m=+1196.751040895" observedRunningTime="2026-01-30 21:34:32.406411222 +0000 UTC m=+1205.845047983" watchObservedRunningTime="2026-01-30 21:34:32.415189323 +0000 UTC m=+1205.853826084" Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.417277 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerStarted","Data":"c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d"} Jan 30 21:34:32 crc kubenswrapper[4914]: I0130 21:34:32.432642 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5jxqn" podStartSLOduration=24.432620833 podStartE2EDuration="24.432620833s" podCreationTimestamp="2026-01-30 21:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:32.42294451 +0000 UTC m=+1205.861581271" watchObservedRunningTime="2026-01-30 21:34:32.432620833 +0000 UTC m=+1205.871257584" Jan 30 21:34:33 crc kubenswrapper[4914]: I0130 21:34:33.456133 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cce3e04-8dbe-4df9-aed0-45303d35e7c4","Type":"ContainerStarted","Data":"f1458a6042e2d7a23a8761ef656701b576048b24cfb4b46c29578f9bddc48ad2"} Jan 30 21:34:33 crc kubenswrapper[4914]: I0130 21:34:33.456582 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cce3e04-8dbe-4df9-aed0-45303d35e7c4","Type":"ContainerStarted","Data":"be8cb6bb42dc103d7c49626741fc2083405b05cb9dd20814432ff3d5970afe4b"} Jan 30 21:34:33 crc kubenswrapper[4914]: I0130 21:34:33.463822 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7939da09-12d4-4b76-9664-cd12cfd93f72","Type":"ContainerStarted","Data":"71ed2d7a3fc976f936122a7214c258388dbe4184523484ced9e276ec97b83734"} Jan 30 21:34:33 crc kubenswrapper[4914]: I0130 21:34:33.494816 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.494776682 podStartE2EDuration="9.494776682s" podCreationTimestamp="2026-01-30 21:34:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:33.478783367 +0000 UTC m=+1206.917420128" watchObservedRunningTime="2026-01-30 21:34:33.494776682 +0000 UTC m=+1206.933413463" Jan 30 21:34:33 crc kubenswrapper[4914]: I0130 21:34:33.515787 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.515766377 podStartE2EDuration="25.515766377s" podCreationTimestamp="2026-01-30 21:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:33.511095405 +0000 UTC m=+1206.949732166" watchObservedRunningTime="2026-01-30 21:34:33.515766377 +0000 UTC m=+1206.954403128" Jan 30 21:34:34 crc kubenswrapper[4914]: I0130 21:34:34.704360 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:34 crc kubenswrapper[4914]: I0130 21:34:34.704684 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:34 crc kubenswrapper[4914]: I0130 21:34:34.756997 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:34 crc kubenswrapper[4914]: I0130 21:34:34.778637 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:35 crc kubenswrapper[4914]: I0130 21:34:35.482997 4914 generic.go:334] "Generic (PLEG): container finished" podID="25c531db-a02c-477b-b968-2f086a8443e8" containerID="9e75279ece50bf2e3425ec5f9fd516a2954e93c69c273e1bc87e22ed259d29ad" exitCode=0 Jan 30 21:34:35 crc kubenswrapper[4914]: I0130 21:34:35.483087 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zkpj" 
event={"ID":"25c531db-a02c-477b-b968-2f086a8443e8","Type":"ContainerDied","Data":"9e75279ece50bf2e3425ec5f9fd516a2954e93c69c273e1bc87e22ed259d29ad"} Jan 30 21:34:35 crc kubenswrapper[4914]: I0130 21:34:35.483246 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:35 crc kubenswrapper[4914]: I0130 21:34:35.483298 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:36 crc kubenswrapper[4914]: I0130 21:34:36.510880 4914 generic.go:334] "Generic (PLEG): container finished" podID="5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" containerID="b554b1f4e32799e7f2ed76f98ec781a2f9ac8d9c8e7cf091df07db8332c1a303" exitCode=0 Jan 30 21:34:36 crc kubenswrapper[4914]: I0130 21:34:36.510934 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5jxqn" event={"ID":"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4","Type":"ContainerDied","Data":"b554b1f4e32799e7f2ed76f98ec781a2f9ac8d9c8e7cf091df07db8332c1a303"} Jan 30 21:34:38 crc kubenswrapper[4914]: I0130 21:34:38.507802 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 21:34:38 crc kubenswrapper[4914]: I0130 21:34:38.509499 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 21:34:38 crc kubenswrapper[4914]: I0130 21:34:38.509622 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 21:34:38 crc kubenswrapper[4914]: I0130 21:34:38.509773 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 21:34:38 crc kubenswrapper[4914]: I0130 21:34:38.560034 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 21:34:38 crc 
kubenswrapper[4914]: I0130 21:34:38.569808 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.046035 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zkpj" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.057673 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.071507 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25c531db-a02c-477b-b968-2f086a8443e8-logs\") pod \"25c531db-a02c-477b-b968-2f086a8443e8\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.071603 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-combined-ca-bundle\") pod \"25c531db-a02c-477b-b968-2f086a8443e8\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.071780 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-config-data\") pod \"25c531db-a02c-477b-b968-2f086a8443e8\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.071816 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jchbv\" (UniqueName: \"kubernetes.io/projected/25c531db-a02c-477b-b968-2f086a8443e8-kube-api-access-jchbv\") pod \"25c531db-a02c-477b-b968-2f086a8443e8\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 
21:34:40.072050 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-scripts\") pod \"25c531db-a02c-477b-b968-2f086a8443e8\" (UID: \"25c531db-a02c-477b-b968-2f086a8443e8\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.076102 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25c531db-a02c-477b-b968-2f086a8443e8-logs" (OuterVolumeSpecName: "logs") pod "25c531db-a02c-477b-b968-2f086a8443e8" (UID: "25c531db-a02c-477b-b968-2f086a8443e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.081470 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c531db-a02c-477b-b968-2f086a8443e8-kube-api-access-jchbv" (OuterVolumeSpecName: "kube-api-access-jchbv") pod "25c531db-a02c-477b-b968-2f086a8443e8" (UID: "25c531db-a02c-477b-b968-2f086a8443e8"). InnerVolumeSpecName "kube-api-access-jchbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.097318 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-scripts" (OuterVolumeSpecName: "scripts") pod "25c531db-a02c-477b-b968-2f086a8443e8" (UID: "25c531db-a02c-477b-b968-2f086a8443e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.139824 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-config-data" (OuterVolumeSpecName: "config-data") pod "25c531db-a02c-477b-b968-2f086a8443e8" (UID: "25c531db-a02c-477b-b968-2f086a8443e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.147655 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25c531db-a02c-477b-b968-2f086a8443e8" (UID: "25c531db-a02c-477b-b968-2f086a8443e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.174516 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8lbc\" (UniqueName: \"kubernetes.io/projected/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-kube-api-access-l8lbc\") pod \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.174744 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-scripts\") pod \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.174923 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-fernet-keys\") pod \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.175023 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-combined-ca-bundle\") pod \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.175113 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-config-data\") pod \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.175216 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-credential-keys\") pod \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\" (UID: \"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4\") " Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.175950 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.175987 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jchbv\" (UniqueName: \"kubernetes.io/projected/25c531db-a02c-477b-b968-2f086a8443e8-kube-api-access-jchbv\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.176008 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.176025 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25c531db-a02c-477b-b968-2f086a8443e8-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.176042 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c531db-a02c-477b-b968-2f086a8443e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 
21:34:40.181117 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" (UID: "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.188096 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" (UID: "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.189370 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-scripts" (OuterVolumeSpecName: "scripts") pod "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" (UID: "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.191862 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-kube-api-access-l8lbc" (OuterVolumeSpecName: "kube-api-access-l8lbc") pod "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" (UID: "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4"). InnerVolumeSpecName "kube-api-access-l8lbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.220015 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-config-data" (OuterVolumeSpecName: "config-data") pod "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" (UID: "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.250917 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" (UID: "5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.277654 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.277699 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.277716 4914 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.277731 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8lbc\" (UniqueName: \"kubernetes.io/projected/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-kube-api-access-l8lbc\") on node \"crc\" 
DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.277760 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.277771 4914 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.569895 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9zkpj" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.569891 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9zkpj" event={"ID":"25c531db-a02c-477b-b968-2f086a8443e8","Type":"ContainerDied","Data":"63f7c6465f10fc5d2fc21953c7c04443ef40c5f76492dbd0cb67764da0d99b38"} Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.570083 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63f7c6465f10fc5d2fc21953c7c04443ef40c5f76492dbd0cb67764da0d99b38" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.571468 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5jxqn" event={"ID":"5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4","Type":"ContainerDied","Data":"e4fe214505bd98ce3f94659c8b9e25076c29cab044960be5ee82e42e8b6ee0ce"} Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.571499 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4fe214505bd98ce3f94659c8b9e25076c29cab044960be5ee82e42e8b6ee0ce" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.571558 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5jxqn" Jan 30 21:34:40 crc kubenswrapper[4914]: I0130 21:34:40.836975 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.200845 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f67697d54-9s42z"] Jan 30 21:34:41 crc kubenswrapper[4914]: E0130 21:34:41.201243 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c531db-a02c-477b-b968-2f086a8443e8" containerName="placement-db-sync" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.201254 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c531db-a02c-477b-b968-2f086a8443e8" containerName="placement-db-sync" Jan 30 21:34:41 crc kubenswrapper[4914]: E0130 21:34:41.201271 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" containerName="keystone-bootstrap" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.201279 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" containerName="keystone-bootstrap" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.201464 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c531db-a02c-477b-b968-2f086a8443e8" containerName="placement-db-sync" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.201481 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" containerName="keystone-bootstrap" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.202481 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.206575 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.206917 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.207150 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.207431 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nhc5p" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.225183 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.225431 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f67697d54-9s42z"] Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.264554 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6c7c6d9b88-rpslq"] Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.266099 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.269736 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.269886 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.269989 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.272048 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.275175 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tzbth" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.279271 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.282646 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6c7c6d9b88-rpslq"] Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297689 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-logs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297751 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-credential-keys\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " 
pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297783 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-config-data\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297812 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-fernet-keys\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297835 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-combined-ca-bundle\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297863 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-combined-ca-bundle\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297882 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-internal-tls-certs\") pod \"placement-7f67697d54-9s42z\" (UID: 
\"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297917 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9nq7\" (UniqueName: \"kubernetes.io/projected/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-kube-api-access-f9nq7\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297946 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-public-tls-certs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297974 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-config-data\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.297993 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-public-tls-certs\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.298011 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-internal-tls-certs\") pod 
\"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.298031 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-scripts\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.298067 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcgjq\" (UniqueName: \"kubernetes.io/projected/c8f39fc5-1811-4872-b99a-4ba212837d75-kube-api-access-rcgjq\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.298100 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-scripts\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406204 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9nq7\" (UniqueName: \"kubernetes.io/projected/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-kube-api-access-f9nq7\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406543 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-public-tls-certs\") pod 
\"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406586 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-config-data\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406613 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-public-tls-certs\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406637 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-internal-tls-certs\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406665 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-scripts\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406725 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcgjq\" (UniqueName: \"kubernetes.io/projected/c8f39fc5-1811-4872-b99a-4ba212837d75-kube-api-access-rcgjq\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " 
pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406784 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-scripts\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406824 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-logs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406843 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-credential-keys\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406865 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-config-data\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406894 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-fernet-keys\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406923 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-combined-ca-bundle\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406960 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-combined-ca-bundle\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.406988 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-internal-tls-certs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.408316 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-logs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.411918 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-public-tls-certs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.412818 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-public-tls-certs\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.413298 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-internal-tls-certs\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.413959 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-scripts\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.414640 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-internal-tls-certs\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.414978 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-credential-keys\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.415010 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-scripts\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: 
\"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.415418 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-combined-ca-bundle\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.415959 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-fernet-keys\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.416867 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-combined-ca-bundle\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.418140 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f39fc5-1811-4872-b99a-4ba212837d75-config-data\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.433638 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-config-data\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 
21:34:41.436210 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9nq7\" (UniqueName: \"kubernetes.io/projected/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-kube-api-access-f9nq7\") pod \"placement-7f67697d54-9s42z\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.437302 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcgjq\" (UniqueName: \"kubernetes.io/projected/c8f39fc5-1811-4872-b99a-4ba212837d75-kube-api-access-rcgjq\") pod \"keystone-6c7c6d9b88-rpslq\" (UID: \"c8f39fc5-1811-4872-b99a-4ba212837d75\") " pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.450760 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.527326 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.587606 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7bf9bcb7dd-cl45x"] Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.589920 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.591010 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609487 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-config-data\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609626 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-scripts\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609663 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-combined-ca-bundle\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609710 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plxmx\" (UniqueName: \"kubernetes.io/projected/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-kube-api-access-plxmx\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609780 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-internal-tls-certs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: 
\"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609812 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-logs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.609864 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-public-tls-certs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.658486 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bf9bcb7dd-cl45x"] Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.711637 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-scripts\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.711928 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-combined-ca-bundle\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.712054 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plxmx\" (UniqueName: 
\"kubernetes.io/projected/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-kube-api-access-plxmx\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.712150 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-internal-tls-certs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.712260 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-logs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.712365 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-public-tls-certs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.712469 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-config-data\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.714749 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-logs\") pod \"placement-7bf9bcb7dd-cl45x\" 
(UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.715926 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-scripts\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.731920 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-combined-ca-bundle\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.732584 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-internal-tls-certs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.732611 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-config-data\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.733398 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-public-tls-certs\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:34:41 crc 
kubenswrapper[4914]: I0130 21:34:41.744144 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plxmx\" (UniqueName: \"kubernetes.io/projected/d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a-kube-api-access-plxmx\") pod \"placement-7bf9bcb7dd-cl45x\" (UID: \"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a\") " pod="openstack/placement-7bf9bcb7dd-cl45x"
Jan 30 21:34:41 crc kubenswrapper[4914]: I0130 21:34:41.917102 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7bf9bcb7dd-cl45x"
Jan 30 21:34:42 crc kubenswrapper[4914]: W0130 21:34:42.952698 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd54b2e31_fb6a_4ab1_b680_8aeb4a663b3a.slice/crio-a585daa338160af4afdd27fdef1eca473a438afff3f07e691b845e9e033e74a7 WatchSource:0}: Error finding container a585daa338160af4afdd27fdef1eca473a438afff3f07e691b845e9e033e74a7: Status 404 returned error can't find the container with id a585daa338160af4afdd27fdef1eca473a438afff3f07e691b845e9e033e74a7
Jan 30 21:34:42 crc kubenswrapper[4914]: I0130 21:34:42.957748 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f67697d54-9s42z"]
Jan 30 21:34:42 crc kubenswrapper[4914]: I0130 21:34:42.968177 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bf9bcb7dd-cl45x"]
Jan 30 21:34:43 crc kubenswrapper[4914]: W0130 21:34:43.127175 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8f39fc5_1811_4872_b99a_4ba212837d75.slice/crio-928af9dcf5b68fc71ba8f502345320524c1f4cf51aecf8c8ef522f5a8d4f4767 WatchSource:0}: Error finding container 928af9dcf5b68fc71ba8f502345320524c1f4cf51aecf8c8ef522f5a8d4f4767: Status 404 returned error can't find the container with id 928af9dcf5b68fc71ba8f502345320524c1f4cf51aecf8c8ef522f5a8d4f4767
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.132263 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6c7c6d9b88-rpslq"]
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.616187 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c7c6d9b88-rpslq" event={"ID":"c8f39fc5-1811-4872-b99a-4ba212837d75","Type":"ContainerStarted","Data":"31ef8a28f60505bb67a7e98964c779689148d4baeeeb725e57d16a1f224ce63b"}
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.616533 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c7c6d9b88-rpslq" event={"ID":"c8f39fc5-1811-4872-b99a-4ba212837d75","Type":"ContainerStarted","Data":"928af9dcf5b68fc71ba8f502345320524c1f4cf51aecf8c8ef522f5a8d4f4767"}
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.616588 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6c7c6d9b88-rpslq"
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.618852 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bf9bcb7dd-cl45x" event={"ID":"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a","Type":"ContainerStarted","Data":"00899fd6e17474b51e6151488e19448ff0456083c45a33d60de088858f1b5a7a"}
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.618883 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bf9bcb7dd-cl45x" event={"ID":"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a","Type":"ContainerStarted","Data":"a585daa338160af4afdd27fdef1eca473a438afff3f07e691b845e9e033e74a7"}
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.638803 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6c7c6d9b88-rpslq" podStartSLOduration=2.638780847 podStartE2EDuration="2.638780847s" podCreationTimestamp="2026-01-30 21:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:43.632098236 +0000 UTC m=+1217.070734997" watchObservedRunningTime="2026-01-30 21:34:43.638780847 +0000 UTC m=+1217.077417608"
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.642258 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f67697d54-9s42z" event={"ID":"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3","Type":"ContainerStarted","Data":"575be469f78a48937e8ec8cbb3339807ee05616b55a117e3cffaa41347bce35c"}
Jan 30 21:34:43 crc kubenswrapper[4914]: I0130 21:34:43.642297 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f67697d54-9s42z" event={"ID":"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3","Type":"ContainerStarted","Data":"14b8d58021de566c4764d74583036778a2e0ed40c527b6288c4d67a38cda0e0a"}
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.652758 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9nl" event={"ID":"f4afaeee-72ae-4c47-b842-d201151915c4","Type":"ContainerStarted","Data":"0f94a44c94e29ccb926b0c7329b8a2486a23c1a7486e21a995e3a0bc8eb6d347"}
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.654656 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f67697d54-9s42z" event={"ID":"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3","Type":"ContainerStarted","Data":"df624bb50c29da314fb00868fd5987f09f1630d6d4682eab3c93a8ea78ec1c37"}
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.654747 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f67697d54-9s42z"
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.654792 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f67697d54-9s42z"
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.656587 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerStarted","Data":"813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20"}
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.658734 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bf9bcb7dd-cl45x" event={"ID":"d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a","Type":"ContainerStarted","Data":"1b9561b82f0e5c1f9f4e30b98dab4585a1afed5996159ab50ea432745cf3ea00"}
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.676807 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6c9nl" podStartSLOduration=3.025491712 podStartE2EDuration="58.676789235s" podCreationTimestamp="2026-01-30 21:33:46 +0000 UTC" firstStartedPulling="2026-01-30 21:33:48.226899627 +0000 UTC m=+1161.665536388" lastFinishedPulling="2026-01-30 21:34:43.87819715 +0000 UTC m=+1217.316833911" observedRunningTime="2026-01-30 21:34:44.667214434 +0000 UTC m=+1218.105851195" watchObservedRunningTime="2026-01-30 21:34:44.676789235 +0000 UTC m=+1218.115426016"
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.689816 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7bf9bcb7dd-cl45x" podStartSLOduration=3.689798968 podStartE2EDuration="3.689798968s" podCreationTimestamp="2026-01-30 21:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:44.688396554 +0000 UTC m=+1218.127033325" watchObservedRunningTime="2026-01-30 21:34:44.689798968 +0000 UTC m=+1218.128435729"
Jan 30 21:34:44 crc kubenswrapper[4914]: I0130 21:34:44.713133 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f67697d54-9s42z" podStartSLOduration=3.713109779 podStartE2EDuration="3.713109779s" podCreationTimestamp="2026-01-30 21:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:44.711207844 +0000 UTC m=+1218.149844595" watchObservedRunningTime="2026-01-30 21:34:44.713109779 +0000 UTC m=+1218.151746550"
Jan 30 21:34:45 crc kubenswrapper[4914]: I0130 21:34:45.667989 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6kskl" event={"ID":"b7fe1c6e-0858-479f-b365-081a1b8fcf2d","Type":"ContainerStarted","Data":"d8e8fbc6307d942d2de54b5e103cd757d9ee4b92d2a881c8a6c2adf44ff1d013"}
Jan 30 21:34:45 crc kubenswrapper[4914]: I0130 21:34:45.669098 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bf9bcb7dd-cl45x"
Jan 30 21:34:45 crc kubenswrapper[4914]: I0130 21:34:45.669199 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bf9bcb7dd-cl45x"
Jan 30 21:34:45 crc kubenswrapper[4914]: I0130 21:34:45.693466 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6kskl" podStartSLOduration=5.138101144 podStartE2EDuration="1m0.693444058s" podCreationTimestamp="2026-01-30 21:33:45 +0000 UTC" firstStartedPulling="2026-01-30 21:33:48.167173509 +0000 UTC m=+1161.605810270" lastFinishedPulling="2026-01-30 21:34:43.722516423 +0000 UTC m=+1217.161153184" observedRunningTime="2026-01-30 21:34:45.682859113 +0000 UTC m=+1219.121495884" watchObservedRunningTime="2026-01-30 21:34:45.693444058 +0000 UTC m=+1219.132080819"
Jan 30 21:34:46 crc kubenswrapper[4914]: I0130 21:34:46.688873 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-t4kd2" event={"ID":"a0548f63-8249-4708-88d9-b3f663b28778","Type":"ContainerStarted","Data":"ec1355b9b35d302affb5d502116cf5e28958ca83204218cbc0d3271ccc0855f3"}
Jan 30 21:34:46 crc kubenswrapper[4914]: I0130 21:34:46.725704 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-t4kd2" podStartSLOduration=3.110326713 podStartE2EDuration="1m0.725683247s" podCreationTimestamp="2026-01-30 21:33:46 +0000 UTC" firstStartedPulling="2026-01-30 21:33:48.364909979 +0000 UTC m=+1161.803546740" lastFinishedPulling="2026-01-30 21:34:45.980266493 +0000 UTC m=+1219.418903274" observedRunningTime="2026-01-30 21:34:46.720445481 +0000 UTC m=+1220.159082242" watchObservedRunningTime="2026-01-30 21:34:46.725683247 +0000 UTC m=+1220.164320008"
Jan 30 21:34:51 crc kubenswrapper[4914]: I0130 21:34:51.746975 4914 generic.go:334] "Generic (PLEG): container finished" podID="f4afaeee-72ae-4c47-b842-d201151915c4" containerID="0f94a44c94e29ccb926b0c7329b8a2486a23c1a7486e21a995e3a0bc8eb6d347" exitCode=0
Jan 30 21:34:51 crc kubenswrapper[4914]: I0130 21:34:51.747158 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9nl" event={"ID":"f4afaeee-72ae-4c47-b842-d201151915c4","Type":"ContainerDied","Data":"0f94a44c94e29ccb926b0c7329b8a2486a23c1a7486e21a995e3a0bc8eb6d347"}
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.399211 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6c9nl"
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.559810 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvqs\" (UniqueName: \"kubernetes.io/projected/f4afaeee-72ae-4c47-b842-d201151915c4-kube-api-access-vqvqs\") pod \"f4afaeee-72ae-4c47-b842-d201151915c4\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") "
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.559875 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-db-sync-config-data\") pod \"f4afaeee-72ae-4c47-b842-d201151915c4\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") "
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.559932 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-combined-ca-bundle\") pod \"f4afaeee-72ae-4c47-b842-d201151915c4\" (UID: \"f4afaeee-72ae-4c47-b842-d201151915c4\") "
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.566956 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4afaeee-72ae-4c47-b842-d201151915c4-kube-api-access-vqvqs" (OuterVolumeSpecName: "kube-api-access-vqvqs") pod "f4afaeee-72ae-4c47-b842-d201151915c4" (UID: "f4afaeee-72ae-4c47-b842-d201151915c4"). InnerVolumeSpecName "kube-api-access-vqvqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.569733 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f4afaeee-72ae-4c47-b842-d201151915c4" (UID: "f4afaeee-72ae-4c47-b842-d201151915c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.584851 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4afaeee-72ae-4c47-b842-d201151915c4" (UID: "f4afaeee-72ae-4c47-b842-d201151915c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.662171 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.662209 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvqs\" (UniqueName: \"kubernetes.io/projected/f4afaeee-72ae-4c47-b842-d201151915c4-kube-api-access-vqvqs\") on node \"crc\" DevicePath \"\""
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.662221 4914 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4afaeee-72ae-4c47-b842-d201151915c4-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.772580 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6c9nl" event={"ID":"f4afaeee-72ae-4c47-b842-d201151915c4","Type":"ContainerDied","Data":"e2d5e5ab23149a97e4d792533a5b619b5a5dd3940bdc1a9bc9a4c9ee4a9b7e22"}
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.772616 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2d5e5ab23149a97e4d792533a5b619b5a5dd3940bdc1a9bc9a4c9ee4a9b7e22"
Jan 30 21:34:53 crc kubenswrapper[4914]: I0130 21:34:53.772623 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6c9nl"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.037770 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d6d6d58ff-ngrc6"]
Jan 30 21:34:54 crc kubenswrapper[4914]: E0130 21:34:54.038408 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4afaeee-72ae-4c47-b842-d201151915c4" containerName="barbican-db-sync"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.038424 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4afaeee-72ae-4c47-b842-d201151915c4" containerName="barbican-db-sync"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.038611 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4afaeee-72ae-4c47-b842-d201151915c4" containerName="barbican-db-sync"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.039584 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.042266 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.042838 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.043104 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9b5p4"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.051235 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d6d6d58ff-ngrc6"]
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.077050 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-d84ffccf8-2q5ts"]
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.078651 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.083861 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.124675 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d84ffccf8-2q5ts"]
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170650 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-config-data-custom\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170747 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-config-data-custom\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170786 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37896a03-ab80-432f-b7b0-490652061464-logs\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170821 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9xt7\" (UniqueName: \"kubernetes.io/projected/3c844063-c103-4c6e-92ae-d9f1e0e897eb-kube-api-access-q9xt7\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170857 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-config-data\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170893 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-config-data\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170915 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5cbh\" (UniqueName: \"kubernetes.io/projected/37896a03-ab80-432f-b7b0-490652061464-kube-api-access-r5cbh\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.170952 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-combined-ca-bundle\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.171023 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-combined-ca-bundle\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.171055 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c844063-c103-4c6e-92ae-d9f1e0e897eb-logs\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.186145 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7pdh"]
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.188137 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.201337 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7pdh"]
Jan 30 21:34:54 crc kubenswrapper[4914]: E0130 21:34:54.264944 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272287 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37896a03-ab80-432f-b7b0-490652061464-logs\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272331 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9xt7\" (UniqueName: \"kubernetes.io/projected/3c844063-c103-4c6e-92ae-d9f1e0e897eb-kube-api-access-q9xt7\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272369 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-config-data\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272401 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-config-data\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272417 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5cbh\" (UniqueName: \"kubernetes.io/projected/37896a03-ab80-432f-b7b0-490652061464-kube-api-access-r5cbh\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272443 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-combined-ca-bundle\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272493 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-combined-ca-bundle\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272516 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c844063-c103-4c6e-92ae-d9f1e0e897eb-logs\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272560 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-config-data-custom\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.272600 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-config-data-custom\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.273602 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37896a03-ab80-432f-b7b0-490652061464-logs\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.275005 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c844063-c103-4c6e-92ae-d9f1e0e897eb-logs\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.279805 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-config-data\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.283275 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-combined-ca-bundle\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.283810 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37896a03-ab80-432f-b7b0-490652061464-config-data-custom\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.289941 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-combined-ca-bundle\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.292317 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-config-data\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.294791 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c844063-c103-4c6e-92ae-d9f1e0e897eb-config-data-custom\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.301756 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9xt7\" (UniqueName: \"kubernetes.io/projected/3c844063-c103-4c6e-92ae-d9f1e0e897eb-kube-api-access-q9xt7\") pod \"barbican-worker-5d6d6d58ff-ngrc6\" (UID: \"3c844063-c103-4c6e-92ae-d9f1e0e897eb\") " pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.305554 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-569449474-v9hfx"]
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.305989 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5cbh\" (UniqueName: \"kubernetes.io/projected/37896a03-ab80-432f-b7b0-490652061464-kube-api-access-r5cbh\") pod \"barbican-keystone-listener-d84ffccf8-2q5ts\" (UID: \"37896a03-ab80-432f-b7b0-490652061464\") " pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.307102 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-569449474-v9hfx"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.310979 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.323526 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-569449474-v9hfx"]
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.375481 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.375524 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.375560 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.375575 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-config\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.375594 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.375621 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc49l\" (UniqueName: \"kubernetes.io/projected/db65d505-10ef-4668-9e71-f4faa42d4915-kube-api-access-sc49l\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.423857 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d6d6d58ff-ngrc6"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.446150 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.478662 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.478956 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.478988 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479026 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-combined-ca-bundle\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479046 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d0533c-95eb-41ee-af58-d85e832a41d3-logs\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479072 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479090 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-config\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479106 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data-custom\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479125 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479158 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc49l\" (UniqueName: \"kubernetes.io/projected/db65d505-10ef-4668-9e71-f4faa42d4915-kube-api-access-sc49l\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.479199 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvqtl\" (UniqueName: \"kubernetes.io/projected/c2d0533c-95eb-41ee-af58-d85e832a41d3-kube-api-access-gvqtl\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.480283 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.480974 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.481520 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.482177 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-config\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.482758 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.503418 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc49l\" (UniqueName: \"kubernetes.io/projected/db65d505-10ef-4668-9e71-f4faa42d4915-kube-api-access-sc49l\") pod \"dnsmasq-dns-586bdc5f9-s7pdh\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.535257 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.580717 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data-custom\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.580791 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvqtl\" (UniqueName: \"kubernetes.io/projected/c2d0533c-95eb-41ee-af58-d85e832a41d3-kube-api-access-gvqtl\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.580880 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.580935 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-combined-ca-bundle\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.580954 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d0533c-95eb-41ee-af58-d85e832a41d3-logs\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " 
pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.581274 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d0533c-95eb-41ee-af58-d85e832a41d3-logs\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.608528 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data-custom\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.611356 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvqtl\" (UniqueName: \"kubernetes.io/projected/c2d0533c-95eb-41ee-af58-d85e832a41d3-kube-api-access-gvqtl\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.612465 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-combined-ca-bundle\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.628247 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data\") pod \"barbican-api-569449474-v9hfx\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 
21:34:54.696475 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.843279 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerStarted","Data":"fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8"} Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.843782 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="ceilometer-notification-agent" containerID="cri-o://c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d" gracePeriod=30 Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.843865 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.844330 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="proxy-httpd" containerID="cri-o://fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8" gracePeriod=30 Jan 30 21:34:54 crc kubenswrapper[4914]: I0130 21:34:54.844387 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="sg-core" containerID="cri-o://813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20" gracePeriod=30 Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.126923 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d6d6d58ff-ngrc6"] Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.141370 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:34:55 crc 
kubenswrapper[4914]: I0130 21:34:55.326668 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d84ffccf8-2q5ts"] Jan 30 21:34:55 crc kubenswrapper[4914]: W0130 21:34:55.363335 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37896a03_ab80_432f_b7b0_490652061464.slice/crio-dc5fbbbc2e29d5883bf2e4918aec407ded32a61288fcd5a62f0b28977fc30066 WatchSource:0}: Error finding container dc5fbbbc2e29d5883bf2e4918aec407ded32a61288fcd5a62f0b28977fc30066: Status 404 returned error can't find the container with id dc5fbbbc2e29d5883bf2e4918aec407ded32a61288fcd5a62f0b28977fc30066 Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.390765 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-569449474-v9hfx"] Jan 30 21:34:55 crc kubenswrapper[4914]: W0130 21:34:55.406690 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb65d505_10ef_4668_9e71_f4faa42d4915.slice/crio-327bfabfc70ba9a75011460e48f0463a19b1384ac47327912a0fa996fa51d2f9 WatchSource:0}: Error finding container 327bfabfc70ba9a75011460e48f0463a19b1384ac47327912a0fa996fa51d2f9: Status 404 returned error can't find the container with id 327bfabfc70ba9a75011460e48f0463a19b1384ac47327912a0fa996fa51d2f9 Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.407527 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7pdh"] Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.854393 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts" event={"ID":"37896a03-ab80-432f-b7b0-490652061464","Type":"ContainerStarted","Data":"dc5fbbbc2e29d5883bf2e4918aec407ded32a61288fcd5a62f0b28977fc30066"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.859343 4914 generic.go:334] "Generic (PLEG): 
container finished" podID="db65d505-10ef-4668-9e71-f4faa42d4915" containerID="e1e53143c3b159e9406fb36aa32a117e13115226bfffe7c0e8aa69db54302785" exitCode=0 Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.860461 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" event={"ID":"db65d505-10ef-4668-9e71-f4faa42d4915","Type":"ContainerDied","Data":"e1e53143c3b159e9406fb36aa32a117e13115226bfffe7c0e8aa69db54302785"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.860528 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" event={"ID":"db65d505-10ef-4668-9e71-f4faa42d4915","Type":"ContainerStarted","Data":"327bfabfc70ba9a75011460e48f0463a19b1384ac47327912a0fa996fa51d2f9"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.870652 4914 generic.go:334] "Generic (PLEG): container finished" podID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerID="fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8" exitCode=0 Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.870687 4914 generic.go:334] "Generic (PLEG): container finished" podID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerID="813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20" exitCode=2 Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.870743 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerDied","Data":"fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.870824 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerDied","Data":"813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.873546 4914 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-api-569449474-v9hfx" event={"ID":"c2d0533c-95eb-41ee-af58-d85e832a41d3","Type":"ContainerStarted","Data":"d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.873582 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569449474-v9hfx" event={"ID":"c2d0533c-95eb-41ee-af58-d85e832a41d3","Type":"ContainerStarted","Data":"95c7e4ee3206628abdc5309b1af973b80d030fceef6ac6a126acfc8d37728adb"} Jan 30 21:34:55 crc kubenswrapper[4914]: I0130 21:34:55.874876 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d6d6d58ff-ngrc6" event={"ID":"3c844063-c103-4c6e-92ae-d9f1e0e897eb","Type":"ContainerStarted","Data":"8e7c3aad1e8e226773a4e77bd5e5321b9ccc8a4f7417e7cb223f2660c8a1bb78"} Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.888258 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" event={"ID":"db65d505-10ef-4668-9e71-f4faa42d4915","Type":"ContainerStarted","Data":"53e501207acc98be1b003614fbe2daefa596b9824a980ced3eba65d7d7c9eb38"} Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.888785 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.890698 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569449474-v9hfx" event={"ID":"c2d0533c-95eb-41ee-af58-d85e832a41d3","Type":"ContainerStarted","Data":"09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba"} Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.891504 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.891535 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-569449474-v9hfx" Jan 
30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.955381 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" podStartSLOduration=2.955355194 podStartE2EDuration="2.955355194s" podCreationTimestamp="2026-01-30 21:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:56.917075602 +0000 UTC m=+1230.355712363" watchObservedRunningTime="2026-01-30 21:34:56.955355194 +0000 UTC m=+1230.393991965" Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.986169 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:34:56 crc kubenswrapper[4914]: I0130 21:34:56.986221 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.238418 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-569449474-v9hfx" podStartSLOduration=3.2384026280000002 podStartE2EDuration="3.238402628s" podCreationTimestamp="2026-01-30 21:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:56.956582883 +0000 UTC m=+1230.395219644" watchObservedRunningTime="2026-01-30 21:34:57.238402628 +0000 UTC m=+1230.677039389" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.242828 4914 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/barbican-api-975f98546-d2x5z"] Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.244339 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.263706 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.264721 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.275045 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-975f98546-d2x5z"] Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.343280 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-config-data-custom\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.343343 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-config-data\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.343489 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-public-tls-certs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc 
kubenswrapper[4914]: I0130 21:34:57.343587 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-combined-ca-bundle\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.343845 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-internal-tls-certs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.344048 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076a462-e37e-496a-9587-78a4f9b07232-logs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.344175 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2crd\" (UniqueName: \"kubernetes.io/projected/b076a462-e37e-496a-9587-78a4f9b07232-kube-api-access-f2crd\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446109 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-combined-ca-bundle\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " 
pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446191 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-internal-tls-certs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446242 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076a462-e37e-496a-9587-78a4f9b07232-logs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446271 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2crd\" (UniqueName: \"kubernetes.io/projected/b076a462-e37e-496a-9587-78a4f9b07232-kube-api-access-f2crd\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446317 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-config-data-custom\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446349 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-config-data\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 
21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.446383 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-public-tls-certs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.447424 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076a462-e37e-496a-9587-78a4f9b07232-logs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.460408 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-public-tls-certs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.468325 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-internal-tls-certs\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.468920 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-config-data-custom\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.475380 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-combined-ca-bundle\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.477995 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2crd\" (UniqueName: \"kubernetes.io/projected/b076a462-e37e-496a-9587-78a4f9b07232-kube-api-access-f2crd\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.481064 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076a462-e37e-496a-9587-78a4f9b07232-config-data\") pod \"barbican-api-975f98546-d2x5z\" (UID: \"b076a462-e37e-496a-9587-78a4f9b07232\") " pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.523279 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.529428 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:34:57 crc kubenswrapper[4914]: I0130 21:34:57.564039 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.935891 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.940477 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d6d6d58ff-ngrc6" event={"ID":"3c844063-c103-4c6e-92ae-d9f1e0e897eb","Type":"ContainerStarted","Data":"6a1763c8dcd096230441f4109715220f39f64b2c9e84e1a8d3967c5d8585d2b1"} Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.944757 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts" event={"ID":"37896a03-ab80-432f-b7b0-490652061464","Type":"ContainerStarted","Data":"360df8b442f10a39814f887bbf3b65aac00a270e3e426112c584125b6f6c7d22"} Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.949165 4914 generic.go:334] "Generic (PLEG): container finished" podID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerID="c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d" exitCode=0 Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.949230 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerDied","Data":"c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d"} Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.949251 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc98d77b-bdf3-4a3b-bfad-95ef146a731e","Type":"ContainerDied","Data":"d8b92462429d341fb7fc009c88578796c99adfb9b32537691c30289cca7cdab9"} Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.949269 4914 scope.go:117] "RemoveContainer" containerID="fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8" Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.949406 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:34:59 crc kubenswrapper[4914]: I0130 21:34:59.990597 4914 scope.go:117] "RemoveContainer" containerID="813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.013410 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-scripts\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.013453 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-config-data\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.013565 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jxxd\" (UniqueName: \"kubernetes.io/projected/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-kube-api-access-4jxxd\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.013587 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-log-httpd\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.013672 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-sg-core-conf-yaml\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 
crc kubenswrapper[4914]: I0130 21:35:00.013775 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-run-httpd\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.013816 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-combined-ca-bundle\") pod \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\" (UID: \"dc98d77b-bdf3-4a3b-bfad-95ef146a731e\") " Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.015506 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.015536 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.019937 4914 scope.go:117] "RemoveContainer" containerID="c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.025313 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-kube-api-access-4jxxd" (OuterVolumeSpecName: "kube-api-access-4jxxd") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "kube-api-access-4jxxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.026255 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-scripts" (OuterVolumeSpecName: "scripts") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.049254 4914 scope.go:117] "RemoveContainer" containerID="fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8" Jan 30 21:35:00 crc kubenswrapper[4914]: E0130 21:35:00.049659 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8\": container with ID starting with fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8 not found: ID does not exist" containerID="fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.049728 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8"} err="failed to get container status \"fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8\": rpc error: code = NotFound desc = could not find container \"fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8\": container with ID starting with fbf466f723708d3168d7da6a41c9a0499438bbd7578b061b795e5fb642ffc7d8 not found: ID does not exist" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.049770 4914 scope.go:117] "RemoveContainer" containerID="813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20" Jan 30 21:35:00 crc kubenswrapper[4914]: E0130 21:35:00.051590 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20\": container with ID starting with 813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20 not found: ID does not exist" containerID="813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.051624 
4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20"} err="failed to get container status \"813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20\": rpc error: code = NotFound desc = could not find container \"813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20\": container with ID starting with 813b915b9cce5ddbc4f5ff70db4cc012de1fc3f10e5e2baefe9883b9a4f0cd20 not found: ID does not exist" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.051647 4914 scope.go:117] "RemoveContainer" containerID="c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d" Jan 30 21:35:00 crc kubenswrapper[4914]: E0130 21:35:00.051910 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d\": container with ID starting with c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d not found: ID does not exist" containerID="c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.051941 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d"} err="failed to get container status \"c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d\": rpc error: code = NotFound desc = could not find container \"c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d\": container with ID starting with c3f7ddb56e28cb00fe58eaceb456a6dc9e7b176ecf9ccb7aa5326fa1ce9fdf5d not found: ID does not exist" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.070605 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-sg-core-conf-yaml" 
(OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.100962 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.116410 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.116443 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.116452 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.116461 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.116470 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jxxd\" (UniqueName: \"kubernetes.io/projected/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-kube-api-access-4jxxd\") on node \"crc\" 
DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.116480 4914 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.125679 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-config-data" (OuterVolumeSpecName: "config-data") pod "dc98d77b-bdf3-4a3b-bfad-95ef146a731e" (UID: "dc98d77b-bdf3-4a3b-bfad-95ef146a731e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.155221 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-975f98546-d2x5z"] Jan 30 21:35:00 crc kubenswrapper[4914]: W0130 21:35:00.160338 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb076a462_e37e_496a_9587_78a4f9b07232.slice/crio-13eb6a86c3d83f3d280d20673df784016c90e30b012c31253b2b9241c561531d WatchSource:0}: Error finding container 13eb6a86c3d83f3d280d20673df784016c90e30b012c31253b2b9241c561531d: Status 404 returned error can't find the container with id 13eb6a86c3d83f3d280d20673df784016c90e30b012c31253b2b9241c561531d Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.218102 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc98d77b-bdf3-4a3b-bfad-95ef146a731e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.320153 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.328871 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:00 crc 
kubenswrapper[4914]: I0130 21:35:00.341462 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:00 crc kubenswrapper[4914]: E0130 21:35:00.341881 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="sg-core" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.341893 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="sg-core" Jan 30 21:35:00 crc kubenswrapper[4914]: E0130 21:35:00.341917 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="proxy-httpd" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.341923 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="proxy-httpd" Jan 30 21:35:00 crc kubenswrapper[4914]: E0130 21:35:00.341946 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="ceilometer-notification-agent" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.341952 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="ceilometer-notification-agent" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.342200 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="ceilometer-notification-agent" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.342215 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="proxy-httpd" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.342228 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" containerName="sg-core" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.344795 4914 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.347195 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.348241 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.356591 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.422965 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.423337 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.423377 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-run-httpd\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.423485 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-log-httpd\") pod \"ceilometer-0\" (UID: 
\"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.423504 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-scripts\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.423573 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-config-data\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.423799 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6klk\" (UniqueName: \"kubernetes.io/projected/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-kube-api-access-z6klk\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525508 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525566 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-run-httpd\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525640 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-log-httpd\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525666 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-scripts\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525689 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-config-data\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525882 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6klk\" (UniqueName: \"kubernetes.io/projected/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-kube-api-access-z6klk\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.525974 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.526543 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-run-httpd\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " 
pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.526964 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-log-httpd\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.530101 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-scripts\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.530565 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.530567 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.542415 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-config-data\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.543911 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6klk\" (UniqueName: 
\"kubernetes.io/projected/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-kube-api-access-z6klk\") pod \"ceilometer-0\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.661116 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.967941 4914 generic.go:334] "Generic (PLEG): container finished" podID="6748bae8-dcab-4fdb-ab49-b60893908a7f" containerID="7aa1e887ff1d5b5fa539da14db28a32b20602da026fc610316282f7c7f00dfdf" exitCode=0 Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.968151 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-46tqv" event={"ID":"6748bae8-dcab-4fdb-ab49-b60893908a7f","Type":"ContainerDied","Data":"7aa1e887ff1d5b5fa539da14db28a32b20602da026fc610316282f7c7f00dfdf"} Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.978168 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d6d6d58ff-ngrc6" event={"ID":"3c844063-c103-4c6e-92ae-d9f1e0e897eb","Type":"ContainerStarted","Data":"43a5a5e89613ef28f8d3682678dc42189a8ff41cc592e9fb02d37b3322ddb026"} Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.997154 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-975f98546-d2x5z" event={"ID":"b076a462-e37e-496a-9587-78a4f9b07232","Type":"ContainerStarted","Data":"7213b0f077427d9ac4e2d5038f6408bb6a3b76686f0897c08116255e79de407d"} Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.997194 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-975f98546-d2x5z" event={"ID":"b076a462-e37e-496a-9587-78a4f9b07232","Type":"ContainerStarted","Data":"742748fd629376f5245b03ecfb8cca383f6cb5125243be7a660e612a40b5e6a8"} Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.997203 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-975f98546-d2x5z" event={"ID":"b076a462-e37e-496a-9587-78a4f9b07232","Type":"ContainerStarted","Data":"13eb6a86c3d83f3d280d20673df784016c90e30b012c31253b2b9241c561531d"} Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.997268 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.997288 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:35:00 crc kubenswrapper[4914]: I0130 21:35:00.998804 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts" event={"ID":"37896a03-ab80-432f-b7b0-490652061464","Type":"ContainerStarted","Data":"c317f9f7c58224078d6962af4ffe3d6759b4ef0e0cfb57fddd2ed34572c4867f"} Jan 30 21:35:01 crc kubenswrapper[4914]: I0130 21:35:01.002443 4914 generic.go:334] "Generic (PLEG): container finished" podID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" containerID="d8e8fbc6307d942d2de54b5e103cd757d9ee4b92d2a881c8a6c2adf44ff1d013" exitCode=0 Jan 30 21:35:01 crc kubenswrapper[4914]: I0130 21:35:01.002481 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6kskl" event={"ID":"b7fe1c6e-0858-479f-b365-081a1b8fcf2d","Type":"ContainerDied","Data":"d8e8fbc6307d942d2de54b5e103cd757d9ee4b92d2a881c8a6c2adf44ff1d013"} Jan 30 21:35:01 crc kubenswrapper[4914]: I0130 21:35:01.036934 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d6d6d58ff-ngrc6" podStartSLOduration=2.713188851 podStartE2EDuration="7.036915377s" podCreationTimestamp="2026-01-30 21:34:54 +0000 UTC" firstStartedPulling="2026-01-30 21:34:55.141139681 +0000 UTC m=+1228.579776442" lastFinishedPulling="2026-01-30 21:34:59.464866197 +0000 UTC m=+1232.903502968" observedRunningTime="2026-01-30 21:35:01.016341224 +0000 UTC m=+1234.454977985" 
watchObservedRunningTime="2026-01-30 21:35:01.036915377 +0000 UTC m=+1234.475552138" Jan 30 21:35:01 crc kubenswrapper[4914]: I0130 21:35:01.047607 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-d84ffccf8-2q5ts" podStartSLOduration=2.940300299 podStartE2EDuration="7.047590563s" podCreationTimestamp="2026-01-30 21:34:54 +0000 UTC" firstStartedPulling="2026-01-30 21:34:55.366407094 +0000 UTC m=+1228.805043855" lastFinishedPulling="2026-01-30 21:34:59.473697348 +0000 UTC m=+1232.912334119" observedRunningTime="2026-01-30 21:35:01.045622546 +0000 UTC m=+1234.484259307" watchObservedRunningTime="2026-01-30 21:35:01.047590563 +0000 UTC m=+1234.486227324" Jan 30 21:35:01 crc kubenswrapper[4914]: I0130 21:35:01.097215 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-975f98546-d2x5z" podStartSLOduration=4.097198832 podStartE2EDuration="4.097198832s" podCreationTimestamp="2026-01-30 21:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:01.096212588 +0000 UTC m=+1234.534849349" watchObservedRunningTime="2026-01-30 21:35:01.097198832 +0000 UTC m=+1234.535835593" Jan 30 21:35:01 crc kubenswrapper[4914]: W0130 21:35:01.147189 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f3f352_1ffb_48b4_b985_0d2d2206c7c1.slice/crio-11c941e0f9b3fbe702d97ee6634204afef0ea525191b8182593a646de41fed21 WatchSource:0}: Error finding container 11c941e0f9b3fbe702d97ee6634204afef0ea525191b8182593a646de41fed21: Status 404 returned error can't find the container with id 11c941e0f9b3fbe702d97ee6634204afef0ea525191b8182593a646de41fed21 Jan 30 21:35:01 crc kubenswrapper[4914]: I0130 21:35:01.153921 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:01 
crc kubenswrapper[4914]: I0130 21:35:01.841229 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc98d77b-bdf3-4a3b-bfad-95ef146a731e" path="/var/lib/kubelet/pods/dc98d77b-bdf3-4a3b-bfad-95ef146a731e/volumes" Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.012507 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerStarted","Data":"11c941e0f9b3fbe702d97ee6634204afef0ea525191b8182593a646de41fed21"} Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.016325 4914 generic.go:334] "Generic (PLEG): container finished" podID="a0548f63-8249-4708-88d9-b3f663b28778" containerID="ec1355b9b35d302affb5d502116cf5e28958ca83204218cbc0d3271ccc0855f3" exitCode=0 Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.016555 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-t4kd2" event={"ID":"a0548f63-8249-4708-88d9-b3f663b28778","Type":"ContainerDied","Data":"ec1355b9b35d302affb5d502116cf5e28958ca83204218cbc0d3271ccc0855f3"} Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.550535 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-46tqv" Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.554547 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6kskl" Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670387 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-config\") pod \"6748bae8-dcab-4fdb-ab49-b60893908a7f\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670456 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-config-data\") pod \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670494 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-combined-ca-bundle\") pod \"6748bae8-dcab-4fdb-ab49-b60893908a7f\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670537 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-etc-machine-id\") pod \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670580 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-combined-ca-bundle\") pod \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670700 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-scripts\") pod \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670880 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmt8w\" (UniqueName: \"kubernetes.io/projected/6748bae8-dcab-4fdb-ab49-b60893908a7f-kube-api-access-qmt8w\") pod \"6748bae8-dcab-4fdb-ab49-b60893908a7f\" (UID: \"6748bae8-dcab-4fdb-ab49-b60893908a7f\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.670970 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-db-sync-config-data\") pod \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.671031 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x48gj\" (UniqueName: \"kubernetes.io/projected/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-kube-api-access-x48gj\") pod \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\" (UID: \"b7fe1c6e-0858-479f-b365-081a1b8fcf2d\") " Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.671170 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b7fe1c6e-0858-479f-b365-081a1b8fcf2d" (UID: "b7fe1c6e-0858-479f-b365-081a1b8fcf2d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.671791 4914 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.676348 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-kube-api-access-x48gj" (OuterVolumeSpecName: "kube-api-access-x48gj") pod "b7fe1c6e-0858-479f-b365-081a1b8fcf2d" (UID: "b7fe1c6e-0858-479f-b365-081a1b8fcf2d"). InnerVolumeSpecName "kube-api-access-x48gj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.676649 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-scripts" (OuterVolumeSpecName: "scripts") pod "b7fe1c6e-0858-479f-b365-081a1b8fcf2d" (UID: "b7fe1c6e-0858-479f-b365-081a1b8fcf2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.676893 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6748bae8-dcab-4fdb-ab49-b60893908a7f-kube-api-access-qmt8w" (OuterVolumeSpecName: "kube-api-access-qmt8w") pod "6748bae8-dcab-4fdb-ab49-b60893908a7f" (UID: "6748bae8-dcab-4fdb-ab49-b60893908a7f"). InnerVolumeSpecName "kube-api-access-qmt8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.678766 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b7fe1c6e-0858-479f-b365-081a1b8fcf2d" (UID: "b7fe1c6e-0858-479f-b365-081a1b8fcf2d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.709236 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-config" (OuterVolumeSpecName: "config") pod "6748bae8-dcab-4fdb-ab49-b60893908a7f" (UID: "6748bae8-dcab-4fdb-ab49-b60893908a7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.712566 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7fe1c6e-0858-479f-b365-081a1b8fcf2d" (UID: "b7fe1c6e-0858-479f-b365-081a1b8fcf2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.713734 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6748bae8-dcab-4fdb-ab49-b60893908a7f" (UID: "6748bae8-dcab-4fdb-ab49-b60893908a7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.741759 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-config-data" (OuterVolumeSpecName: "config-data") pod "b7fe1c6e-0858-479f-b365-081a1b8fcf2d" (UID: "b7fe1c6e-0858-479f-b365-081a1b8fcf2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775675 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775748 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmt8w\" (UniqueName: \"kubernetes.io/projected/6748bae8-dcab-4fdb-ab49-b60893908a7f-kube-api-access-qmt8w\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775765 4914 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775777 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48gj\" (UniqueName: \"kubernetes.io/projected/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-kube-api-access-x48gj\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775788 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775798 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775808 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6748bae8-dcab-4fdb-ab49-b60893908a7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:02 crc kubenswrapper[4914]: I0130 21:35:02.775817 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe1c6e-0858-479f-b365-081a1b8fcf2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.066915 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-46tqv" event={"ID":"6748bae8-dcab-4fdb-ab49-b60893908a7f","Type":"ContainerDied","Data":"8b57b201de81879e978a54221bce597132b6e15ab5017570185d67e7f3f865d1"}
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.067294 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b57b201de81879e978a54221bce597132b6e15ab5017570185d67e7f3f865d1"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.067391 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-46tqv"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.079296 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerStarted","Data":"f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5"}
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.094090 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6kskl"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.096830 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6kskl" event={"ID":"b7fe1c6e-0858-479f-b365-081a1b8fcf2d","Type":"ContainerDied","Data":"1ab7c073dc13824bb150dc0eeb0af2511ed4618d4a5cff972d84bd32055f5b4e"}
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.096883 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ab7c073dc13824bb150dc0eeb0af2511ed4618d4a5cff972d84bd32055f5b4e"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.225314 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7pdh"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.225534 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="dnsmasq-dns" containerID="cri-o://53e501207acc98be1b003614fbe2daefa596b9824a980ced3eba65d7d7c9eb38" gracePeriod=10
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.234907 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.373914 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-p52ll"]
Jan 30 21:35:03 crc kubenswrapper[4914]: E0130 21:35:03.374276 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6748bae8-dcab-4fdb-ab49-b60893908a7f" containerName="neutron-db-sync"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.374287 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6748bae8-dcab-4fdb-ab49-b60893908a7f" containerName="neutron-db-sync"
Jan 30 21:35:03 crc kubenswrapper[4914]: E0130 21:35:03.374306 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" containerName="cinder-db-sync"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.374312 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" containerName="cinder-db-sync"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.374480 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6748bae8-dcab-4fdb-ab49-b60893908a7f" containerName="neutron-db-sync"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.374488 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" containerName="cinder-db-sync"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.375524 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.406931 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-p52ll"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.494184 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.494317 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.494366 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.494441 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-config\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.494469 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-svc\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.494511 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b25c\" (UniqueName: \"kubernetes.io/projected/4a62f780-aee7-4b2b-95e8-39951fcc3dea-kube-api-access-7b25c\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.582541 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.584287 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.595310 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.595584 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wcgnb"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.595740 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.597012 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.597225 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.597296 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-config\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.597324 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-svc\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.597347 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b25c\" (UniqueName: \"kubernetes.io/projected/4a62f780-aee7-4b2b-95e8-39951fcc3dea-kube-api-access-7b25c\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.597370 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.598390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.598966 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.599455 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.599716 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-svc\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.600186 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-config\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.611765 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.632665 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.700649 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.700689 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.700760 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdgxz\" (UniqueName: \"kubernetes.io/projected/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-kube-api-access-rdgxz\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.700848 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-scripts\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.700890 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.700923 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.726713 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b25c\" (UniqueName: \"kubernetes.io/projected/4a62f780-aee7-4b2b-95e8-39951fcc3dea-kube-api-access-7b25c\") pod \"dnsmasq-dns-85ff748b95-p52ll\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.805059 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdgxz\" (UniqueName: \"kubernetes.io/projected/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-kube-api-access-rdgxz\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.805164 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-scripts\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.805201 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.805236 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.805278 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.805299 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.820685 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-scripts\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.828268 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.828868 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.832889 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.837632 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdgxz\" (UniqueName: \"kubernetes.io/projected/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-kube-api-access-rdgxz\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.853257 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.907732 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-p52ll"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.908330 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5789d46bdd-5kscc"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.914422 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5789d46bdd-5kscc"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.914450 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gpjkz"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.916970 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gpjkz"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.917001 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.918955 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.919952 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.920249 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.922834 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.936268 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.936302 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.937197 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wkslc"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.937327 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Jan 30 21:35:03 crc kubenswrapper[4914]: I0130 21:35:03.937442 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.015308 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec90be3-5dfd-48aa-934c-70ef856a51c5-logs\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.015347 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.015372 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-scripts\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016403 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82fj2\" (UniqueName: \"kubernetes.io/projected/af8dbc06-6b83-49c0-9413-56a90165fb97-kube-api-access-82fj2\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016437 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ec90be3-5dfd-48aa-934c-70ef856a51c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016465 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016498 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016583 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-ovndb-tls-certs\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016609 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-httpd-config\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016648 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016669 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-combined-ca-bundle\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016696 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016754 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2rb\" (UniqueName: \"kubernetes.io/projected/3b59a0a4-c433-4c7f-a789-b84c19ce532c-kube-api-access-wm2rb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016791 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkz2n\" (UniqueName: \"kubernetes.io/projected/1ec90be3-5dfd-48aa-934c-70ef856a51c5-kube-api-access-nkz2n\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016814 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-config\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.016846 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.018061 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-config\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.018115 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.079488 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-p52ll"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.105234 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.107356 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-t4kd2"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.113772 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-t4kd2" event={"ID":"a0548f63-8249-4708-88d9-b3f663b28778","Type":"ContainerDied","Data":"76ca29269c6439ffd5817368ade8500ca4d0ecf77684033d3cbbf8029b6ed3b4"}
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.113807 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ca29269c6439ffd5817368ade8500ca4d0ecf77684033d3cbbf8029b6ed3b4"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.113837 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-t4kd2"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.117692 4914 generic.go:334] "Generic (PLEG): container finished" podID="db65d505-10ef-4668-9e71-f4faa42d4915" containerID="53e501207acc98be1b003614fbe2daefa596b9824a980ced3eba65d7d7c9eb38" exitCode=0
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.117743 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" event={"ID":"db65d505-10ef-4668-9e71-f4faa42d4915","Type":"ContainerDied","Data":"53e501207acc98be1b003614fbe2daefa596b9824a980ced3eba65d7d7c9eb38"}
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119756 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-ovndb-tls-certs\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119786 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-httpd-config\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119810 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119825 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-combined-ca-bundle\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119843 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119871 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm2rb\" (UniqueName: \"kubernetes.io/projected/3b59a0a4-c433-4c7f-a789-b84c19ce532c-kube-api-access-wm2rb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119894 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkz2n\" (UniqueName: \"kubernetes.io/projected/1ec90be3-5dfd-48aa-934c-70ef856a51c5-kube-api-access-nkz2n\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119912 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-config\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz"
Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.119933 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-swift-storage-0\") pod
\"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.120733 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-config\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.120764 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.120797 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec90be3-5dfd-48aa-934c-70ef856a51c5-logs\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.123776 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.123851 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-scripts\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 
21:35:04.123920 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82fj2\" (UniqueName: \"kubernetes.io/projected/af8dbc06-6b83-49c0-9413-56a90165fb97-kube-api-access-82fj2\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.123943 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ec90be3-5dfd-48aa-934c-70ef856a51c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.123985 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.124033 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.129461 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec90be3-5dfd-48aa-934c-70ef856a51c5-logs\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.130987 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-config\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.135306 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.135799 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-httpd-config\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.136680 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.139082 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ec90be3-5dfd-48aa-934c-70ef856a51c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.139544 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-scripts\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " 
pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.142855 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-ovndb-tls-certs\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.142909 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.143507 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.144909 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm2rb\" (UniqueName: \"kubernetes.io/projected/3b59a0a4-c433-4c7f-a789-b84c19ce532c-kube-api-access-wm2rb\") pod \"dnsmasq-dns-5c9776ccc5-gpjkz\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.146790 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-config\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.148461 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-combined-ca-bundle\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.150003 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.153427 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.153932 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82fj2\" (UniqueName: \"kubernetes.io/projected/af8dbc06-6b83-49c0-9413-56a90165fb97-kube-api-access-82fj2\") pod \"neutron-5789d46bdd-5kscc\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.167780 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.178399 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkz2n\" (UniqueName: \"kubernetes.io/projected/1ec90be3-5dfd-48aa-934c-70ef856a51c5-kube-api-access-nkz2n\") pod 
\"cinder-api-0\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.225761 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-certs\") pod \"a0548f63-8249-4708-88d9-b3f663b28778\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.225837 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-config-data\") pod \"a0548f63-8249-4708-88d9-b3f663b28778\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.225863 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-scripts\") pod \"a0548f63-8249-4708-88d9-b3f663b28778\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.225979 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knbtb\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-kube-api-access-knbtb\") pod \"a0548f63-8249-4708-88d9-b3f663b28778\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.226037 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-combined-ca-bundle\") pod \"a0548f63-8249-4708-88d9-b3f663b28778\" (UID: \"a0548f63-8249-4708-88d9-b3f663b28778\") " Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.235058 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-scripts" (OuterVolumeSpecName: "scripts") pod "a0548f63-8249-4708-88d9-b3f663b28778" (UID: "a0548f63-8249-4708-88d9-b3f663b28778"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.240269 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-kube-api-access-knbtb" (OuterVolumeSpecName: "kube-api-access-knbtb") pod "a0548f63-8249-4708-88d9-b3f663b28778" (UID: "a0548f63-8249-4708-88d9-b3f663b28778"). InnerVolumeSpecName "kube-api-access-knbtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.246424 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-certs" (OuterVolumeSpecName: "certs") pod "a0548f63-8249-4708-88d9-b3f663b28778" (UID: "a0548f63-8249-4708-88d9-b3f663b28778"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.261089 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0548f63-8249-4708-88d9-b3f663b28778" (UID: "a0548f63-8249-4708-88d9-b3f663b28778"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.303770 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.319882 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-config-data" (OuterVolumeSpecName: "config-data") pod "a0548f63-8249-4708-88d9-b3f663b28778" (UID: "a0548f63-8249-4708-88d9-b3f663b28778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.326371 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.329379 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.329413 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.329422 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.329433 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0548f63-8249-4708-88d9-b3f663b28778-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.329441 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knbtb\" (UniqueName: \"kubernetes.io/projected/a0548f63-8249-4708-88d9-b3f663b28778-kube-api-access-knbtb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:04 crc 
kubenswrapper[4914]: I0130 21:35:04.336994 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.535954 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.175:5353: connect: connection refused" Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.735290 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-p52ll"] Jan 30 21:35:04 crc kubenswrapper[4914]: W0130 21:35:04.761058 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a62f780_aee7_4b2b_95e8_39951fcc3dea.slice/crio-f638b44d9237875d7e30661c79a3e8a364b3dc08dd4be1af8451f139749f4844 WatchSource:0}: Error finding container f638b44d9237875d7e30661c79a3e8a364b3dc08dd4be1af8451f139749f4844: Status 404 returned error can't find the container with id f638b44d9237875d7e30661c79a3e8a364b3dc08dd4be1af8451f139749f4844 Jan 30 21:35:04 crc kubenswrapper[4914]: I0130 21:35:04.877753 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:35:04 crc kubenswrapper[4914]: W0130 21:35:04.881612 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e6778a2_f469_4b92_b7d1_a5545b9f9ed0.slice/crio-e76f8fed8850259dc4e6c89cc7a6fe3c2903f27a9f539c18998471bad6e1c5af WatchSource:0}: Error finding container e76f8fed8850259dc4e6c89cc7a6fe3c2903f27a9f539c18998471bad6e1c5af: Status 404 returned error can't find the container with id e76f8fed8850259dc4e6c89cc7a6fe3c2903f27a9f539c18998471bad6e1c5af Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.037519 4914 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.136217 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-p52ll" event={"ID":"4a62f780-aee7-4b2b-95e8-39951fcc3dea","Type":"ContainerStarted","Data":"f638b44d9237875d7e30661c79a3e8a364b3dc08dd4be1af8451f139749f4844"} Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.145512 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerStarted","Data":"237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8"} Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.146636 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0","Type":"ContainerStarted","Data":"e76f8fed8850259dc4e6c89cc7a6fe3c2903f27a9f539c18998471bad6e1c5af"} Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.147888 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" event={"ID":"db65d505-10ef-4668-9e71-f4faa42d4915","Type":"ContainerDied","Data":"327bfabfc70ba9a75011460e48f0463a19b1384ac47327912a0fa996fa51d2f9"} Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.147918 4914 scope.go:117] "RemoveContainer" containerID="53e501207acc98be1b003614fbe2daefa596b9824a980ced3eba65d7d7c9eb38" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.148074 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7pdh" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.156132 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-svc\") pod \"db65d505-10ef-4668-9e71-f4faa42d4915\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.156176 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-nb\") pod \"db65d505-10ef-4668-9e71-f4faa42d4915\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.156248 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc49l\" (UniqueName: \"kubernetes.io/projected/db65d505-10ef-4668-9e71-f4faa42d4915-kube-api-access-sc49l\") pod \"db65d505-10ef-4668-9e71-f4faa42d4915\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.156321 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-swift-storage-0\") pod \"db65d505-10ef-4668-9e71-f4faa42d4915\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.156396 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-config\") pod \"db65d505-10ef-4668-9e71-f4faa42d4915\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.156426 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-sb\") pod \"db65d505-10ef-4668-9e71-f4faa42d4915\" (UID: \"db65d505-10ef-4668-9e71-f4faa42d4915\") " Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.178169 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db65d505-10ef-4668-9e71-f4faa42d4915-kube-api-access-sc49l" (OuterVolumeSpecName: "kube-api-access-sc49l") pod "db65d505-10ef-4668-9e71-f4faa42d4915" (UID: "db65d505-10ef-4668-9e71-f4faa42d4915"). InnerVolumeSpecName "kube-api-access-sc49l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.208091 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5789d46bdd-5kscc"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.258272 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc49l\" (UniqueName: \"kubernetes.io/projected/db65d505-10ef-4668-9e71-f4faa42d4915-kube-api-access-sc49l\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.265861 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.279792 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-m6cp5"] Jan 30 21:35:05 crc kubenswrapper[4914]: E0130 21:35:05.280296 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="init" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.280320 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="init" Jan 30 21:35:05 crc kubenswrapper[4914]: E0130 21:35:05.280340 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="dnsmasq-dns" Jan 30 21:35:05 crc 
kubenswrapper[4914]: I0130 21:35:05.280349 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="dnsmasq-dns" Jan 30 21:35:05 crc kubenswrapper[4914]: E0130 21:35:05.280371 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0548f63-8249-4708-88d9-b3f663b28778" containerName="cloudkitty-db-sync" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.280381 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0548f63-8249-4708-88d9-b3f663b28778" containerName="cloudkitty-db-sync" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.280551 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" containerName="dnsmasq-dns" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.280574 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0548f63-8249-4708-88d9-b3f663b28778" containerName="cloudkitty-db-sync" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.281336 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.286383 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gpjkz"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.291291 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.291490 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-tfj5s" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.291684 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.291810 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.291916 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.302274 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-m6cp5"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.360984 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-combined-ca-bundle\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.361038 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-scripts\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " 
pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.361258 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-certs\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.361376 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmv6\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-kube-api-access-8cmv6\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.361419 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-config-data\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.499697 4914 scope.go:117] "RemoveContainer" containerID="e1e53143c3b159e9406fb36aa32a117e13115226bfffe7c0e8aa69db54302785" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.516917 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-certs\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.516978 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmv6\" (UniqueName: 
\"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-kube-api-access-8cmv6\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.517003 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-config-data\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.517067 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-combined-ca-bundle\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.517086 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-scripts\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.550559 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db65d505-10ef-4668-9e71-f4faa42d4915" (UID: "db65d505-10ef-4668-9e71-f4faa42d4915"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.557271 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "db65d505-10ef-4668-9e71-f4faa42d4915" (UID: "db65d505-10ef-4668-9e71-f4faa42d4915"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.558789 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-config-data\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.558961 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-combined-ca-bundle\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.565148 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-scripts\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.568204 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmv6\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-kube-api-access-8cmv6\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " 
pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.569653 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-certs\") pod \"cloudkitty-storageinit-m6cp5\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.570090 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db65d505-10ef-4668-9e71-f4faa42d4915" (UID: "db65d505-10ef-4668-9e71-f4faa42d4915"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.574379 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.623934 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db65d505-10ef-4668-9e71-f4faa42d4915" (UID: "db65d505-10ef-4668-9e71-f4faa42d4915"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.646008 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.646293 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.646339 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.646355 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.655764 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-config" (OuterVolumeSpecName: "config") pod "db65d505-10ef-4668-9e71-f4faa42d4915" (UID: "db65d505-10ef-4668-9e71-f4faa42d4915"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.747958 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65d505-10ef-4668-9e71-f4faa42d4915-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.868767 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7pdh"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.868805 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7pdh"] Jan 30 21:35:05 crc kubenswrapper[4914]: I0130 21:35:05.869677 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:06 crc kubenswrapper[4914]: I0130 21:35:06.170919 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" event={"ID":"3b59a0a4-c433-4c7f-a789-b84c19ce532c","Type":"ContainerStarted","Data":"7a1b6a576e77249deff47171ff3f0155744d68013a0ee7021611106af4fdc728"} Jan 30 21:35:06 crc kubenswrapper[4914]: I0130 21:35:06.173299 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-p52ll" event={"ID":"4a62f780-aee7-4b2b-95e8-39951fcc3dea","Type":"ContainerStarted","Data":"1dba185fca64240cc9714abd29c56b5d5795b7eb5fa83971bbcb87dc529eb53d"} Jan 30 21:35:06 crc kubenswrapper[4914]: I0130 21:35:06.176620 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1ec90be3-5dfd-48aa-934c-70ef856a51c5","Type":"ContainerStarted","Data":"b7ac8414b0072fad275b965fb0dfb37793cf06c9b24a703a9efa63fd2f4ac6fe"} Jan 30 21:35:06 crc kubenswrapper[4914]: I0130 21:35:06.178296 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5789d46bdd-5kscc" 
event={"ID":"af8dbc06-6b83-49c0-9413-56a90165fb97","Type":"ContainerStarted","Data":"72c8415d9ec3f5921dc4284d76a85723e413ce9cadcc3a2715ef55d4b919fff2"} Jan 30 21:35:06 crc kubenswrapper[4914]: I0130 21:35:06.529435 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-m6cp5"] Jan 30 21:35:06 crc kubenswrapper[4914]: W0130 21:35:06.586717 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b23541_8035_417c_8ea6_69cf3c0b2758.slice/crio-755b0f969cfdb1520a88cf0e967e08b00ed393c377024170fed685a29bb85780 WatchSource:0}: Error finding container 755b0f969cfdb1520a88cf0e967e08b00ed393c377024170fed685a29bb85780: Status 404 returned error can't find the container with id 755b0f969cfdb1520a88cf0e967e08b00ed393c377024170fed685a29bb85780 Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.096532 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.228215 4914 generic.go:334] "Generic (PLEG): container finished" podID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerID="585de9fe657511be774b9cac30c695df39751c4385d1c61e48dd4303aa9c27ce" exitCode=0 Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.228342 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" event={"ID":"3b59a0a4-c433-4c7f-a789-b84c19ce532c","Type":"ContainerDied","Data":"585de9fe657511be774b9cac30c695df39751c4385d1c61e48dd4303aa9c27ce"} Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.256850 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1ec90be3-5dfd-48aa-934c-70ef856a51c5","Type":"ContainerStarted","Data":"0a050a81069911fee324bcfa300600734ba13e068a4cfdd947f441e91740437d"} Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.265163 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-5789d46bdd-5kscc" event={"ID":"af8dbc06-6b83-49c0-9413-56a90165fb97","Type":"ContainerStarted","Data":"0195835ed917d263a825d5070d67bbb21d31e2b7db9444936980af6a4d7a30e6"} Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.269072 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-m6cp5" event={"ID":"60b23541-8035-417c-8ea6-69cf3c0b2758","Type":"ContainerStarted","Data":"755b0f969cfdb1520a88cf0e967e08b00ed393c377024170fed685a29bb85780"} Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.277110 4914 generic.go:334] "Generic (PLEG): container finished" podID="4a62f780-aee7-4b2b-95e8-39951fcc3dea" containerID="1dba185fca64240cc9714abd29c56b5d5795b7eb5fa83971bbcb87dc529eb53d" exitCode=0 Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.277152 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-p52ll" event={"ID":"4a62f780-aee7-4b2b-95e8-39951fcc3dea","Type":"ContainerDied","Data":"1dba185fca64240cc9714abd29c56b5d5795b7eb5fa83971bbcb87dc529eb53d"} Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.335791 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.879653 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-p52ll" Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.888470 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db65d505-10ef-4668-9e71-f4faa42d4915" path="/var/lib/kubelet/pods/db65d505-10ef-4668-9e71-f4faa42d4915/volumes" Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.952400 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-config\") pod \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.952507 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-sb\") pod \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.952543 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-swift-storage-0\") pod \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.952569 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-nb\") pod \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.952587 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-svc\") pod 
\"4a62f780-aee7-4b2b-95e8-39951fcc3dea\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.952729 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b25c\" (UniqueName: \"kubernetes.io/projected/4a62f780-aee7-4b2b-95e8-39951fcc3dea-kube-api-access-7b25c\") pod \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\" (UID: \"4a62f780-aee7-4b2b-95e8-39951fcc3dea\") " Jan 30 21:35:07 crc kubenswrapper[4914]: I0130 21:35:07.973200 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a62f780-aee7-4b2b-95e8-39951fcc3dea-kube-api-access-7b25c" (OuterVolumeSpecName: "kube-api-access-7b25c") pod "4a62f780-aee7-4b2b-95e8-39951fcc3dea" (UID: "4a62f780-aee7-4b2b-95e8-39951fcc3dea"). InnerVolumeSpecName "kube-api-access-7b25c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.007952 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-config" (OuterVolumeSpecName: "config") pod "4a62f780-aee7-4b2b-95e8-39951fcc3dea" (UID: "4a62f780-aee7-4b2b-95e8-39951fcc3dea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.008566 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a62f780-aee7-4b2b-95e8-39951fcc3dea" (UID: "4a62f780-aee7-4b2b-95e8-39951fcc3dea"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.030191 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a62f780-aee7-4b2b-95e8-39951fcc3dea" (UID: "4a62f780-aee7-4b2b-95e8-39951fcc3dea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.039573 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a62f780-aee7-4b2b-95e8-39951fcc3dea" (UID: "4a62f780-aee7-4b2b-95e8-39951fcc3dea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.039745 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a62f780-aee7-4b2b-95e8-39951fcc3dea" (UID: "4a62f780-aee7-4b2b-95e8-39951fcc3dea"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.055998 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b25c\" (UniqueName: \"kubernetes.io/projected/4a62f780-aee7-4b2b-95e8-39951fcc3dea-kube-api-access-7b25c\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.056032 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.056046 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.056055 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.056064 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.056072 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a62f780-aee7-4b2b-95e8-39951fcc3dea-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.306912 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5789d46bdd-5kscc" event={"ID":"af8dbc06-6b83-49c0-9413-56a90165fb97","Type":"ContainerStarted","Data":"cd9fa6fd0ba56738d050e4726d6578839bcda9c1b3fee5ae13e7dd973ed8100a"} Jan 30 21:35:08 crc 
kubenswrapper[4914]: I0130 21:35:08.308388 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.319438 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-m6cp5" event={"ID":"60b23541-8035-417c-8ea6-69cf3c0b2758","Type":"ContainerStarted","Data":"1cb7e0c8338c4e43b8e92ecacd04c310d38d3d5f776f226e015397c968c01590"} Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.336557 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5789d46bdd-5kscc" podStartSLOduration=5.33653885 podStartE2EDuration="5.33653885s" podCreationTimestamp="2026-01-30 21:35:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:08.322476593 +0000 UTC m=+1241.761113344" watchObservedRunningTime="2026-01-30 21:35:08.33653885 +0000 UTC m=+1241.775175611" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.338376 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-p52ll" event={"ID":"4a62f780-aee7-4b2b-95e8-39951fcc3dea","Type":"ContainerDied","Data":"f638b44d9237875d7e30661c79a3e8a364b3dc08dd4be1af8451f139749f4844"} Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.338426 4914 scope.go:117] "RemoveContainer" containerID="1dba185fca64240cc9714abd29c56b5d5795b7eb5fa83971bbcb87dc529eb53d" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.338554 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-p52ll" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.345744 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-m6cp5" podStartSLOduration=3.3457261 podStartE2EDuration="3.3457261s" podCreationTimestamp="2026-01-30 21:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:08.338914607 +0000 UTC m=+1241.777551368" watchObservedRunningTime="2026-01-30 21:35:08.3457261 +0000 UTC m=+1241.784362861" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.346625 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" event={"ID":"3b59a0a4-c433-4c7f-a789-b84c19ce532c","Type":"ContainerStarted","Data":"6328dc5f4ff94d585fd3046f699039c93a02416277657d4dca3d9cc90eba6f45"} Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.347353 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.362327 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerStarted","Data":"6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04"} Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.368225 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1ec90be3-5dfd-48aa-934c-70ef856a51c5","Type":"ContainerStarted","Data":"947c4b37c2af5121b8ffad8108987b966bc66f3d7f4b14041a26c5e61078604c"} Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.368734 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.368605 4914 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api" containerID="cri-o://947c4b37c2af5121b8ffad8108987b966bc66f3d7f4b14041a26c5e61078604c" gracePeriod=30 Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.368455 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api-log" containerID="cri-o://0a050a81069911fee324bcfa300600734ba13e068a4cfdd947f441e91740437d" gracePeriod=30 Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.376434 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" podStartSLOduration=5.376416486 podStartE2EDuration="5.376416486s" podCreationTimestamp="2026-01-30 21:35:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:08.364266335 +0000 UTC m=+1241.802903096" watchObservedRunningTime="2026-01-30 21:35:08.376416486 +0000 UTC m=+1241.815053247" Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.377887 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0","Type":"ContainerStarted","Data":"b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9"} Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.402748 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-p52ll"] Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.432169 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-p52ll"] Jan 30 21:35:08 crc kubenswrapper[4914]: I0130 21:35:08.441362 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.441343022 
podStartE2EDuration="5.441343022s" podCreationTimestamp="2026-01-30 21:35:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:08.423260918 +0000 UTC m=+1241.861897679" watchObservedRunningTime="2026-01-30 21:35:08.441343022 +0000 UTC m=+1241.879979783" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.389875 4914 generic.go:334] "Generic (PLEG): container finished" podID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerID="0a050a81069911fee324bcfa300600734ba13e068a4cfdd947f441e91740437d" exitCode=143 Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.390138 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1ec90be3-5dfd-48aa-934c-70ef856a51c5","Type":"ContainerDied","Data":"0a050a81069911fee324bcfa300600734ba13e068a4cfdd947f441e91740437d"} Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.393351 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0","Type":"ContainerStarted","Data":"2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c"} Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.415779 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.328100355 podStartE2EDuration="6.415763892s" podCreationTimestamp="2026-01-30 21:35:03 +0000 UTC" firstStartedPulling="2026-01-30 21:35:04.889216351 +0000 UTC m=+1238.327853112" lastFinishedPulling="2026-01-30 21:35:06.976879888 +0000 UTC m=+1240.415516649" observedRunningTime="2026-01-30 21:35:09.410587968 +0000 UTC m=+1242.849224729" watchObservedRunningTime="2026-01-30 21:35:09.415763892 +0000 UTC m=+1242.854400653" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.828344 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4a62f780-aee7-4b2b-95e8-39951fcc3dea" path="/var/lib/kubelet/pods/4a62f780-aee7-4b2b-95e8-39951fcc3dea/volumes" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.871458 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-594584649-k6kdl"] Jan 30 21:35:09 crc kubenswrapper[4914]: E0130 21:35:09.871857 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a62f780-aee7-4b2b-95e8-39951fcc3dea" containerName="init" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.871873 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a62f780-aee7-4b2b-95e8-39951fcc3dea" containerName="init" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.872046 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a62f780-aee7-4b2b-95e8-39951fcc3dea" containerName="init" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.874872 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.876674 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.877367 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 30 21:35:09 crc kubenswrapper[4914]: I0130 21:35:09.902943 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-594584649-k6kdl"] Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.022866 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.026666 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-config\") pod \"neutron-594584649-k6kdl\" (UID: 
\"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.026761 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbqvn\" (UniqueName: \"kubernetes.io/projected/60c348bd-a7cd-4220-aad6-39e33a8b3649-kube-api-access-hbqvn\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.026813 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-public-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.026989 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-ovndb-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.027087 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-combined-ca-bundle\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.027112 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-httpd-config\") pod \"neutron-594584649-k6kdl\" 
(UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.027250 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-internal-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129554 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-config\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129604 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-public-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129622 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqvn\" (UniqueName: \"kubernetes.io/projected/60c348bd-a7cd-4220-aad6-39e33a8b3649-kube-api-access-hbqvn\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129676 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-ovndb-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " 
pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129851 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-combined-ca-bundle\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129871 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-httpd-config\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.129909 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-internal-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.136144 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-ovndb-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.137426 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-public-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.140434 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-combined-ca-bundle\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.150791 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqvn\" (UniqueName: \"kubernetes.io/projected/60c348bd-a7cd-4220-aad6-39e33a8b3649-kube-api-access-hbqvn\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.154527 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-config\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.154618 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-httpd-config\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.158577 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c348bd-a7cd-4220-aad6-39e33a8b3649-internal-tls-certs\") pod \"neutron-594584649-k6kdl\" (UID: \"60c348bd-a7cd-4220-aad6-39e33a8b3649\") " pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.190659 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.266007 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-975f98546-d2x5z" Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.346651 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-569449474-v9hfx"] Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.347757 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-569449474-v9hfx" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api-log" containerID="cri-o://d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5" gracePeriod=30 Jan 30 21:35:10 crc kubenswrapper[4914]: I0130 21:35:10.347868 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-569449474-v9hfx" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api" containerID="cri-o://09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba" gracePeriod=30 Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.142418 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-594584649-k6kdl"] Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.428910 4914 generic.go:334] "Generic (PLEG): container finished" podID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerID="d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5" exitCode=143 Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.428992 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569449474-v9hfx" event={"ID":"c2d0533c-95eb-41ee-af58-d85e832a41d3","Type":"ContainerDied","Data":"d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5"} Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.446640 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerStarted","Data":"e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d"} Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.446959 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.451856 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-594584649-k6kdl" event={"ID":"60c348bd-a7cd-4220-aad6-39e33a8b3649","Type":"ContainerStarted","Data":"6d8978c73d364bd1251caa94119ce252f77f0ad83bf565522dddccb8d3ac3335"} Jan 30 21:35:11 crc kubenswrapper[4914]: I0130 21:35:11.492252 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.194796165 podStartE2EDuration="11.492237551s" podCreationTimestamp="2026-01-30 21:35:00 +0000 UTC" firstStartedPulling="2026-01-30 21:35:01.149340412 +0000 UTC m=+1234.587977173" lastFinishedPulling="2026-01-30 21:35:10.446781798 +0000 UTC m=+1243.885418559" observedRunningTime="2026-01-30 21:35:11.488982563 +0000 UTC m=+1244.927619324" watchObservedRunningTime="2026-01-30 21:35:11.492237551 +0000 UTC m=+1244.930874312" Jan 30 21:35:12 crc kubenswrapper[4914]: I0130 21:35:12.472691 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-594584649-k6kdl" event={"ID":"60c348bd-a7cd-4220-aad6-39e33a8b3649","Type":"ContainerStarted","Data":"dac3e620a151b3760bfed8bcd4cb7c6f606d8f4ad7c2aa20ef3c5df0a23e1d18"} Jan 30 21:35:12 crc kubenswrapper[4914]: I0130 21:35:12.472994 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-594584649-k6kdl" event={"ID":"60c348bd-a7cd-4220-aad6-39e33a8b3649","Type":"ContainerStarted","Data":"097075fe632e1b401626bc74159436bdbcf023a86e82b8e50e72d3ae725a0e33"} Jan 30 21:35:12 crc kubenswrapper[4914]: I0130 21:35:12.473008 4914 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:12 crc kubenswrapper[4914]: I0130 21:35:12.511830 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-594584649-k6kdl" podStartSLOduration=3.511798713 podStartE2EDuration="3.511798713s" podCreationTimestamp="2026-01-30 21:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:12.491405654 +0000 UTC m=+1245.930042415" watchObservedRunningTime="2026-01-30 21:35:12.511798713 +0000 UTC m=+1245.950435484" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.073301 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.106451 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.147155 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.153154 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bf9bcb7dd-cl45x" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.175025 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.218266 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data\") pod \"c2d0533c-95eb-41ee-af58-d85e832a41d3\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.218373 4914 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-combined-ca-bundle\") pod \"c2d0533c-95eb-41ee-af58-d85e832a41d3\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.218498 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvqtl\" (UniqueName: \"kubernetes.io/projected/c2d0533c-95eb-41ee-af58-d85e832a41d3-kube-api-access-gvqtl\") pod \"c2d0533c-95eb-41ee-af58-d85e832a41d3\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.218523 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data-custom\") pod \"c2d0533c-95eb-41ee-af58-d85e832a41d3\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.219144 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d0533c-95eb-41ee-af58-d85e832a41d3-logs\") pod \"c2d0533c-95eb-41ee-af58-d85e832a41d3\" (UID: \"c2d0533c-95eb-41ee-af58-d85e832a41d3\") " Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.220846 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d0533c-95eb-41ee-af58-d85e832a41d3-logs" (OuterVolumeSpecName: "logs") pod "c2d0533c-95eb-41ee-af58-d85e832a41d3" (UID: "c2d0533c-95eb-41ee-af58-d85e832a41d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.225488 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f67697d54-9s42z"] Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.228774 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2d0533c-95eb-41ee-af58-d85e832a41d3" (UID: "c2d0533c-95eb-41ee-af58-d85e832a41d3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.253939 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d0533c-95eb-41ee-af58-d85e832a41d3-kube-api-access-gvqtl" (OuterVolumeSpecName: "kube-api-access-gvqtl") pod "c2d0533c-95eb-41ee-af58-d85e832a41d3" (UID: "c2d0533c-95eb-41ee-af58-d85e832a41d3"). InnerVolumeSpecName "kube-api-access-gvqtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.276145 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2d0533c-95eb-41ee-af58-d85e832a41d3" (UID: "c2d0533c-95eb-41ee-af58-d85e832a41d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.311162 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data" (OuterVolumeSpecName: "config-data") pod "c2d0533c-95eb-41ee-af58-d85e832a41d3" (UID: "c2d0533c-95eb-41ee-af58-d85e832a41d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.322584 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.322620 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvqtl\" (UniqueName: \"kubernetes.io/projected/c2d0533c-95eb-41ee-af58-d85e832a41d3-kube-api-access-gvqtl\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.322636 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.322647 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2d0533c-95eb-41ee-af58-d85e832a41d3-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.322657 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2d0533c-95eb-41ee-af58-d85e832a41d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.338932 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.390940 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dkdht"] Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.391168 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerName="dnsmasq-dns" 
containerID="cri-o://1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee" gracePeriod=10 Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.491593 4914 generic.go:334] "Generic (PLEG): container finished" podID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerID="09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba" exitCode=0 Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.491926 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569449474-v9hfx" event={"ID":"c2d0533c-95eb-41ee-af58-d85e832a41d3","Type":"ContainerDied","Data":"09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba"} Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.492009 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569449474-v9hfx" event={"ID":"c2d0533c-95eb-41ee-af58-d85e832a41d3","Type":"ContainerDied","Data":"95c7e4ee3206628abdc5309b1af973b80d030fceef6ac6a126acfc8d37728adb"} Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.492088 4914 scope.go:117] "RemoveContainer" containerID="09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.492264 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-569449474-v9hfx" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.500306 4914 generic.go:334] "Generic (PLEG): container finished" podID="60b23541-8035-417c-8ea6-69cf3c0b2758" containerID="1cb7e0c8338c4e43b8e92ecacd04c310d38d3d5f776f226e015397c968c01590" exitCode=0 Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.500997 4914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.501145 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f67697d54-9s42z" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-log" containerID="cri-o://575be469f78a48937e8ec8cbb3339807ee05616b55a117e3cffaa41347bce35c" gracePeriod=30 Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.501238 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-m6cp5" event={"ID":"60b23541-8035-417c-8ea6-69cf3c0b2758","Type":"ContainerDied","Data":"1cb7e0c8338c4e43b8e92ecacd04c310d38d3d5f776f226e015397c968c01590"} Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.501350 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f67697d54-9s42z" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-api" containerID="cri-o://df624bb50c29da314fb00868fd5987f09f1630d6d4682eab3c93a8ea78ec1c37" gracePeriod=30 Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.507109 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7f67697d54-9s42z" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.170:8778/\": EOF" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.511229 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7f67697d54-9s42z" 
podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.170:8778/\": EOF" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.536192 4914 scope.go:117] "RemoveContainer" containerID="d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.555875 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.568908 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-569449474-v9hfx"] Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.569594 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-569449474-v9hfx"] Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.599774 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.705492 4914 scope.go:117] "RemoveContainer" containerID="09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba" Jan 30 21:35:14 crc kubenswrapper[4914]: E0130 21:35:14.705940 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba\": container with ID starting with 09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba not found: ID does not exist" containerID="09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.705977 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba"} err="failed to get container status \"09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba\": rpc error: code = NotFound desc = 
could not find container \"09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba\": container with ID starting with 09db82c5b7273909327fe13b74f0eac9fe7953343eefb01fd9a29b44cec5adba not found: ID does not exist" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.706007 4914 scope.go:117] "RemoveContainer" containerID="d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5" Jan 30 21:35:14 crc kubenswrapper[4914]: E0130 21:35:14.706324 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5\": container with ID starting with d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5 not found: ID does not exist" containerID="d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.706363 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5"} err="failed to get container status \"d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5\": rpc error: code = NotFound desc = could not find container \"d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5\": container with ID starting with d744e6181492333e1babbb97873cfb973aa9345927d3f77bd5a539f61709e2d5 not found: ID does not exist" Jan 30 21:35:14 crc kubenswrapper[4914]: I0130 21:35:14.722568 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6c7c6d9b88-rpslq" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.487761 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.515166 4914 generic.go:334] "Generic (PLEG): container finished" podID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerID="575be469f78a48937e8ec8cbb3339807ee05616b55a117e3cffaa41347bce35c" exitCode=143 Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.515645 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f67697d54-9s42z" event={"ID":"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3","Type":"ContainerDied","Data":"575be469f78a48937e8ec8cbb3339807ee05616b55a117e3cffaa41347bce35c"} Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.519562 4914 generic.go:334] "Generic (PLEG): container finished" podID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerID="1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee" exitCode=0 Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.519775 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" event={"ID":"da9750a0-8f17-4cf7-9935-5da9c43a9a48","Type":"ContainerDied","Data":"1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee"} Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.519885 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" event={"ID":"da9750a0-8f17-4cf7-9935-5da9c43a9a48","Type":"ContainerDied","Data":"9920a73b71161a0427fd4e58d516856f5d92d178f00d18b9da1cdfd44dc0de0b"} Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.519994 4914 scope.go:117] "RemoveContainer" containerID="1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.520217 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dkdht" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.522238 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="cinder-scheduler" containerID="cri-o://b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9" gracePeriod=30 Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.522372 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="probe" containerID="cri-o://2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c" gracePeriod=30 Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.573302 4914 scope.go:117] "RemoveContainer" containerID="f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.616598 4914 scope.go:117] "RemoveContainer" containerID="1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee" Jan 30 21:35:15 crc kubenswrapper[4914]: E0130 21:35:15.618267 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee\": container with ID starting with 1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee not found: ID does not exist" containerID="1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.618318 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee"} err="failed to get container status \"1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee\": rpc error: code = NotFound desc = could not find container 
\"1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee\": container with ID starting with 1f86fcfd7a4a23f8ba0d8d72a74de48d2707adc205ba97def3888baccf6077ee not found: ID does not exist" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.618353 4914 scope.go:117] "RemoveContainer" containerID="f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98" Jan 30 21:35:15 crc kubenswrapper[4914]: E0130 21:35:15.619356 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98\": container with ID starting with f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98 not found: ID does not exist" containerID="f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.619410 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98"} err="failed to get container status \"f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98\": rpc error: code = NotFound desc = could not find container \"f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98\": container with ID starting with f5816a4859663460f7cf9d1486d797c11fbb2351964b9c0f7e063de33f91fb98 not found: ID does not exist" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.665845 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-nb\") pod \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.665931 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cktjv\" (UniqueName: 
\"kubernetes.io/projected/da9750a0-8f17-4cf7-9935-5da9c43a9a48-kube-api-access-cktjv\") pod \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.665979 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-config\") pod \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.665998 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-svc\") pod \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.666020 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-sb\") pod \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.666157 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-swift-storage-0\") pod \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\" (UID: \"da9750a0-8f17-4cf7-9935-5da9c43a9a48\") " Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.678052 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9750a0-8f17-4cf7-9935-5da9c43a9a48-kube-api-access-cktjv" (OuterVolumeSpecName: "kube-api-access-cktjv") pod "da9750a0-8f17-4cf7-9935-5da9c43a9a48" (UID: "da9750a0-8f17-4cf7-9935-5da9c43a9a48"). InnerVolumeSpecName "kube-api-access-cktjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.764340 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da9750a0-8f17-4cf7-9935-5da9c43a9a48" (UID: "da9750a0-8f17-4cf7-9935-5da9c43a9a48"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.765420 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da9750a0-8f17-4cf7-9935-5da9c43a9a48" (UID: "da9750a0-8f17-4cf7-9935-5da9c43a9a48"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.768496 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.768522 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cktjv\" (UniqueName: \"kubernetes.io/projected/da9750a0-8f17-4cf7-9935-5da9c43a9a48-kube-api-access-cktjv\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.768535 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.780245 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-config" (OuterVolumeSpecName: "config") 
pod "da9750a0-8f17-4cf7-9935-5da9c43a9a48" (UID: "da9750a0-8f17-4cf7-9935-5da9c43a9a48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.830558 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da9750a0-8f17-4cf7-9935-5da9c43a9a48" (UID: "da9750a0-8f17-4cf7-9935-5da9c43a9a48"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.855381 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" path="/var/lib/kubelet/pods/c2d0533c-95eb-41ee-af58-d85e832a41d3/volumes" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.861339 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da9750a0-8f17-4cf7-9935-5da9c43a9a48" (UID: "da9750a0-8f17-4cf7-9935-5da9c43a9a48"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.871758 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.872088 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.872101 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9750a0-8f17-4cf7-9935-5da9c43a9a48-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:15 crc kubenswrapper[4914]: I0130 21:35:15.940198 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.075415 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-config-data\") pod \"60b23541-8035-417c-8ea6-69cf3c0b2758\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.075728 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-combined-ca-bundle\") pod \"60b23541-8035-417c-8ea6-69cf3c0b2758\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.076369 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-scripts\") pod 
\"60b23541-8035-417c-8ea6-69cf3c0b2758\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.076526 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cmv6\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-kube-api-access-8cmv6\") pod \"60b23541-8035-417c-8ea6-69cf3c0b2758\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.076645 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-certs\") pod \"60b23541-8035-417c-8ea6-69cf3c0b2758\" (UID: \"60b23541-8035-417c-8ea6-69cf3c0b2758\") " Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.079778 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-scripts" (OuterVolumeSpecName: "scripts") pod "60b23541-8035-417c-8ea6-69cf3c0b2758" (UID: "60b23541-8035-417c-8ea6-69cf3c0b2758"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.079866 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-certs" (OuterVolumeSpecName: "certs") pod "60b23541-8035-417c-8ea6-69cf3c0b2758" (UID: "60b23541-8035-417c-8ea6-69cf3c0b2758"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.081019 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-kube-api-access-8cmv6" (OuterVolumeSpecName: "kube-api-access-8cmv6") pod "60b23541-8035-417c-8ea6-69cf3c0b2758" (UID: "60b23541-8035-417c-8ea6-69cf3c0b2758"). 
InnerVolumeSpecName "kube-api-access-8cmv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.104320 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-config-data" (OuterVolumeSpecName: "config-data") pod "60b23541-8035-417c-8ea6-69cf3c0b2758" (UID: "60b23541-8035-417c-8ea6-69cf3c0b2758"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.125800 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b23541-8035-417c-8ea6-69cf3c0b2758" (UID: "60b23541-8035-417c-8ea6-69cf3c0b2758"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.156965 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dkdht"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.166218 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dkdht"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.179134 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.179165 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.179174 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cmv6\" (UniqueName: 
\"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-kube-api-access-8cmv6\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.179182 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/60b23541-8035-417c-8ea6-69cf3c0b2758-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.179190 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b23541-8035-417c-8ea6-69cf3c0b2758-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.549199 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-m6cp5" event={"ID":"60b23541-8035-417c-8ea6-69cf3c0b2758","Type":"ContainerDied","Data":"755b0f969cfdb1520a88cf0e967e08b00ed393c377024170fed685a29bb85780"} Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.549255 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="755b0f969cfdb1520a88cf0e967e08b00ed393c377024170fed685a29bb85780" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.549194 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-m6cp5" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.552640 4914 generic.go:334] "Generic (PLEG): container finished" podID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerID="2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c" exitCode=0 Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.552717 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0","Type":"ContainerDied","Data":"2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c"} Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.615226 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.817749 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:16 crc kubenswrapper[4914]: E0130 21:35:16.819490 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerName="dnsmasq-dns" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819725 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerName="dnsmasq-dns" Jan 30 21:35:16 crc kubenswrapper[4914]: E0130 21:35:16.819743 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerName="init" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819751 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerName="init" Jan 30 21:35:16 crc kubenswrapper[4914]: E0130 21:35:16.819764 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api-log" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819770 4914 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api-log" Jan 30 21:35:16 crc kubenswrapper[4914]: E0130 21:35:16.819781 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819787 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api" Jan 30 21:35:16 crc kubenswrapper[4914]: E0130 21:35:16.819798 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b23541-8035-417c-8ea6-69cf3c0b2758" containerName="cloudkitty-storageinit" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819804 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b23541-8035-417c-8ea6-69cf3c0b2758" containerName="cloudkitty-storageinit" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819960 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b23541-8035-417c-8ea6-69cf3c0b2758" containerName="cloudkitty-storageinit" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819974 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api-log" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819981 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d0533c-95eb-41ee-af58-d85e832a41d3" containerName="barbican-api" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.819997 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" containerName="dnsmasq-dns" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.820631 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.825173 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.825757 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.826023 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.826043 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.826180 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-tfj5s" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.840016 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.890116 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.891438 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.897070 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.897245 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-494wg" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.897276 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.909775 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-spmt5"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.911502 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.927643 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.941923 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-spmt5"] Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.996629 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.996674 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-certs\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 
21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998520 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psf8r\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-kube-api-access-psf8r\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998548 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998566 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf26p\" (UniqueName: \"kubernetes.io/projected/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-kube-api-access-tf26p\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998584 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998608 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998671 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998764 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:16 crc kubenswrapper[4914]: I0130 21:35:16.998782 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.021102 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.023301 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.026233 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.067759 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100578 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100626 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-certs\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100651 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47sjm\" (UniqueName: \"kubernetes.io/projected/f7b96580-046d-4a22-b866-78dd55234c0a-kube-api-access-47sjm\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100679 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100782 
4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gflpk\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-kube-api-access-gflpk\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100884 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c93db6-0100-4770-886a-cf59be2d4f4e-logs\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.100971 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101057 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101086 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101124 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101189 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101219 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-certs\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101279 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-svc\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101907 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.101957 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-config\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102016 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102051 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102067 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102084 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psf8r\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-kube-api-access-psf8r\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102105 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf26p\" (UniqueName: 
\"kubernetes.io/projected/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-kube-api-access-tf26p\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102126 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102165 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.102188 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-scripts\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.103353 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.107850 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 
21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.108749 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.113186 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.114357 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-certs\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.114899 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.115263 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.118490 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psf8r\" (UniqueName: 
\"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-kube-api-access-psf8r\") pod \"cloudkitty-proc-0\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.123351 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.129429 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf26p\" (UniqueName: \"kubernetes.io/projected/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-kube-api-access-tf26p\") pod \"openstackclient\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.142885 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.211890 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47sjm\" (UniqueName: \"kubernetes.io/projected/f7b96580-046d-4a22-b866-78dd55234c0a-kube-api-access-47sjm\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.211951 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gflpk\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-kube-api-access-gflpk\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.211992 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c93db6-0100-4770-886a-cf59be2d4f4e-logs\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212024 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212067 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212125 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-svc\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212182 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212212 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-config\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212247 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212273 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212310 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-scripts\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212336 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.212366 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-certs\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.217246 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-certs\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.218082 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c93db6-0100-4770-886a-cf59be2d4f4e-logs\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.218739 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc 
kubenswrapper[4914]: I0130 21:35:17.222536 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.223082 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-svc\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.223171 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.224741 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.224904 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-config\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.225336 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " 
pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.231558 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.235349 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.241921 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-scripts\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.258197 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47sjm\" (UniqueName: \"kubernetes.io/projected/f7b96580-046d-4a22-b866-78dd55234c0a-kube-api-access-47sjm\") pod \"dnsmasq-dns-67bdc55879-spmt5\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.274532 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gflpk\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-kube-api-access-gflpk\") pod \"cloudkitty-api-0\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.282795 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/openstackclient"] Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.356106 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.371346 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.474693 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.496940 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.536869 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.549796 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 21:35:17 crc kubenswrapper[4914]: E0130 21:35:17.554308 4914 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 30 21:35:17 crc kubenswrapper[4914]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_f89cb7ff-7802-4832-b14a-3e41fe5c0e4d_0(cb3cecf6374a6d8c295cf5a664e787e9087a54373a771417259a4c7f571c186b): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cb3cecf6374a6d8c295cf5a664e787e9087a54373a771417259a4c7f571c186b" Netns:"/var/run/netns/8f2a2cbd-1c27-425d-a914-f6da869bb4fc" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=cb3cecf6374a6d8c295cf5a664e787e9087a54373a771417259a4c7f571c186b;K8S_POD_UID=f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" Path:"" ERRORED: error configuring pod 
[openstack/openstackclient] networking: Multus: [openstack/openstackclient/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d]: expected pod UID "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" but got "ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed" from Kube API Jan 30 21:35:17 crc kubenswrapper[4914]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 21:35:17 crc kubenswrapper[4914]: > Jan 30 21:35:17 crc kubenswrapper[4914]: E0130 21:35:17.554361 4914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 30 21:35:17 crc kubenswrapper[4914]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_f89cb7ff-7802-4832-b14a-3e41fe5c0e4d_0(cb3cecf6374a6d8c295cf5a664e787e9087a54373a771417259a4c7f571c186b): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cb3cecf6374a6d8c295cf5a664e787e9087a54373a771417259a4c7f571c186b" Netns:"/var/run/netns/8f2a2cbd-1c27-425d-a914-f6da869bb4fc" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=cb3cecf6374a6d8c295cf5a664e787e9087a54373a771417259a4c7f571c186b;K8S_POD_UID=f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d]: expected pod UID "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" but got "ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed" from Kube API Jan 30 21:35:17 crc kubenswrapper[4914]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 21:35:17 crc kubenswrapper[4914]: > pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.678947 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6z9\" (UniqueName: \"kubernetes.io/projected/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-kube-api-access-ss6z9\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.679254 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-openstack-config\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.679289 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.679370 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " 
pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.781960 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6z9\" (UniqueName: \"kubernetes.io/projected/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-kube-api-access-ss6z9\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.782032 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-openstack-config\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.782085 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.782138 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.784128 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-openstack-config\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.802388 4914 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.805618 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.820648 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6z9\" (UniqueName: \"kubernetes.io/projected/ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed-kube-api-access-ss6z9\") pod \"openstackclient\" (UID: \"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed\") " pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.853929 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9750a0-8f17-4cf7-9935-5da9c43a9a48" path="/var/lib/kubelet/pods/da9750a0-8f17-4cf7-9935-5da9c43a9a48/volumes" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.869161 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 21:35:17 crc kubenswrapper[4914]: I0130 21:35:17.941665 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.166902 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-spmt5"] Jan 30 21:35:18 crc kubenswrapper[4914]: W0130 21:35:18.175852 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0c93db6_0100_4770_886a_cf59be2d4f4e.slice/crio-38c4470eccc77981f24a1d5e66972aa4829f1a15186e8b9ae8104235bc546318 WatchSource:0}: Error finding container 38c4470eccc77981f24a1d5e66972aa4829f1a15186e8b9ae8104235bc546318: Status 404 returned error can't find the container with id 38c4470eccc77981f24a1d5e66972aa4829f1a15186e8b9ae8104235bc546318 Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.180829 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.537023 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 21:35:18 crc kubenswrapper[4914]: W0130 21:35:18.541153 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce7a9cdf_dbb2_4055_a31e_b0fb2771bfed.slice/crio-9a6ef248498a8251312ac7f4a49f08e52361c5232fb45becd9ff0fe210345b24 WatchSource:0}: Error finding container 9a6ef248498a8251312ac7f4a49f08e52361c5232fb45becd9ff0fe210345b24: Status 404 returned error can't find the container with id 9a6ef248498a8251312ac7f4a49f08e52361c5232fb45becd9ff0fe210345b24 Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.607885 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" 
event={"ID":"3ecfd996-1b82-40f5-a3e2-fc926b8806a9","Type":"ContainerStarted","Data":"f4553125a8dec3107c63be7e9b7ff3c6201b12408096b1647879955b1bb1f363"} Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.609148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" event={"ID":"f7b96580-046d-4a22-b866-78dd55234c0a","Type":"ContainerStarted","Data":"93a1d1af276d6d1b71136f60aef2e47570670954af733b34885998c3f1f31a06"} Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.613867 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed","Type":"ContainerStarted","Data":"9a6ef248498a8251312ac7f4a49f08e52361c5232fb45becd9ff0fe210345b24"} Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.615770 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c0c93db6-0100-4770-886a-cf59be2d4f4e","Type":"ContainerStarted","Data":"38c4470eccc77981f24a1d5e66972aa4829f1a15186e8b9ae8104235bc546318"} Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.622207 4914 generic.go:334] "Generic (PLEG): container finished" podID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerID="df624bb50c29da314fb00868fd5987f09f1630d6d4682eab3c93a8ea78ec1c37" exitCode=0 Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.622290 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.622764 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f67697d54-9s42z" event={"ID":"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3","Type":"ContainerDied","Data":"df624bb50c29da314fb00868fd5987f09f1630d6d4682eab3c93a8ea78ec1c37"} Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.726380 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.730054 4914 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" podUID="ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.822817 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config\") pod \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.822951 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf26p\" (UniqueName: \"kubernetes.io/projected/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-kube-api-access-tf26p\") pod \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.822994 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config-secret\") pod \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.823044 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-combined-ca-bundle\") pod \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\" (UID: \"f89cb7ff-7802-4832-b14a-3e41fe5c0e4d\") " Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.825688 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" (UID: "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.842408 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-kube-api-access-tf26p" (OuterVolumeSpecName: "kube-api-access-tf26p") pod "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" (UID: "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d"). InnerVolumeSpecName "kube-api-access-tf26p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.844881 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" (UID: "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.846837 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" (UID: "f89cb7ff-7802-4832-b14a-3e41fe5c0e4d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.925087 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf26p\" (UniqueName: \"kubernetes.io/projected/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-kube-api-access-tf26p\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.925437 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.925451 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:18 crc kubenswrapper[4914]: I0130 21:35:18.925462 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.090785 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f67697d54-9s42z" Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146297 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-public-tls-certs\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146374 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-internal-tls-certs\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146416 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-logs\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146471 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9nq7\" (UniqueName: \"kubernetes.io/projected/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-kube-api-access-f9nq7\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146495 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-scripts\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146511 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-config-data\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.146529 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-combined-ca-bundle\") pod \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\" (UID: \"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3\") " Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.147228 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-logs" (OuterVolumeSpecName: "logs") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.150021 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-kube-api-access-f9nq7" (OuterVolumeSpecName: "kube-api-access-f9nq7") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "kube-api-access-f9nq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.181822 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-scripts" (OuterVolumeSpecName: "scripts") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.248389 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-logs\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.248436 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9nq7\" (UniqueName: \"kubernetes.io/projected/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-kube-api-access-f9nq7\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.248449 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.457698 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-config-data" (OuterVolumeSpecName: "config-data") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.475827 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.477338 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.522829 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557331 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data\") pod \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") "
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557416 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-combined-ca-bundle\") pod \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") "
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557576 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-scripts\") pod \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") "
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557610 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data-custom\") pod \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") "
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557671 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdgxz\" (UniqueName: \"kubernetes.io/projected/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-kube-api-access-rdgxz\") pod \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") "
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557698 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-etc-machine-id\") pod \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\" (UID: \"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0\") "
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.557880 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" (UID: "2f02d8a4-592b-4b89-aab2-28dbe6d57ec3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.558299 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.558320 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.558354 4914 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.558363 4914 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.558867 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" (UID: "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.562979 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-scripts" (OuterVolumeSpecName: "scripts") pod "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" (UID: "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.573679 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-kube-api-access-rdgxz" (OuterVolumeSpecName: "kube-api-access-rdgxz") pod "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" (UID: "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0"). InnerVolumeSpecName "kube-api-access-rdgxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.573805 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" (UID: "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.642417 4914 generic.go:334] "Generic (PLEG): container finished" podID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerID="b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9" exitCode=0
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.642468 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.642536 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0","Type":"ContainerDied","Data":"b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9"}
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.642570 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4e6778a2-f469-4b92-b7d1-a5545b9f9ed0","Type":"ContainerDied","Data":"e76f8fed8850259dc4e6c89cc7a6fe3c2903f27a9f539c18998471bad6e1c5af"}
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.642588 4914 scope.go:117] "RemoveContainer" containerID="2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.647953 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f67697d54-9s42z" event={"ID":"2f02d8a4-592b-4b89-aab2-28dbe6d57ec3","Type":"ContainerDied","Data":"14b8d58021de566c4764d74583036778a2e0ed40c527b6288c4d67a38cda0e0a"}
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.648068 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f67697d54-9s42z"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.652922 4914 generic.go:334] "Generic (PLEG): container finished" podID="f7b96580-046d-4a22-b866-78dd55234c0a" containerID="88f7ac1c0cdd9115908efb204c089ddf0248133f9f26ab03074090a884bf99c1" exitCode=0
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.652977 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" event={"ID":"f7b96580-046d-4a22-b866-78dd55234c0a","Type":"ContainerDied","Data":"88f7ac1c0cdd9115908efb204c089ddf0248133f9f26ab03074090a884bf99c1"}
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.657817 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c0c93db6-0100-4770-886a-cf59be2d4f4e","Type":"ContainerStarted","Data":"9a2585871d7a8b9c8f9044e4c92aa4188006a57b48001c4651bac140dcfc6834"}
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.659611 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.661778 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.661814 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.661827 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdgxz\" (UniqueName: \"kubernetes.io/projected/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-kube-api-access-rdgxz\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.661839 4914 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.714916 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" (UID: "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.810432 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.838546 4914 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" podUID="ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.921291 4914 scope.go:117] "RemoveContainer" containerID="b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.935922 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data" (OuterVolumeSpecName: "config-data") pod "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" (UID: "4e6778a2-f469-4b92-b7d1-a5545b9f9ed0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.940211 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89cb7ff-7802-4832-b14a-3e41fe5c0e4d" path="/var/lib/kubelet/pods/f89cb7ff-7802-4832-b14a-3e41fe5c0e4d/volumes"
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.940946 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f67697d54-9s42z"]
Jan 30 21:35:19 crc kubenswrapper[4914]: I0130 21:35:19.960550 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f67697d54-9s42z"]
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.020276 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.080164 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.092254 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.100375 4914 scope.go:117] "RemoveContainer" containerID="2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.100473 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:20 crc kubenswrapper[4914]: E0130 21:35:20.100921 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="probe"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.100938 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="probe"
Jan 30 21:35:20 crc kubenswrapper[4914]: E0130 21:35:20.100955 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="cinder-scheduler"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.100962 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="cinder-scheduler"
Jan 30 21:35:20 crc kubenswrapper[4914]: E0130 21:35:20.100973 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-log"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.100979 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-log"
Jan 30 21:35:20 crc kubenswrapper[4914]: E0130 21:35:20.101005 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-api"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.101011 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-api"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.101193 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="cinder-scheduler"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.101211 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-log"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.101223 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" containerName="probe"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.101242 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" containerName="placement-api"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.102318 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: E0130 21:35:20.105466 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c\": container with ID starting with 2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c not found: ID does not exist" containerID="2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.105505 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.105499 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c"} err="failed to get container status \"2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c\": rpc error: code = NotFound desc = could not find container \"2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c\": container with ID starting with 2490b8d2ba5ad29849bea75befe9025b00558fff6f0d48f6425fb0c1c658116c not found: ID does not exist"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.105667 4914 scope.go:117] "RemoveContainer" containerID="b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9"
Jan 30 21:35:20 crc kubenswrapper[4914]: E0130 21:35:20.106041 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9\": container with ID starting with b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9 not found: ID does not exist" containerID="b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.106079 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9"} err="failed to get container status \"b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9\": rpc error: code = NotFound desc = could not find container \"b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9\": container with ID starting with b0623e4ff177438bdaafaf776dfa21dd7415f8d0d183ba8f5e4aaf63fc915fc9 not found: ID does not exist"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.106112 4914 scope.go:117] "RemoveContainer" containerID="df624bb50c29da314fb00868fd5987f09f1630d6d4682eab3c93a8ea78ec1c37"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.107969 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.157617 4914 scope.go:117] "RemoveContainer" containerID="575be469f78a48937e8ec8cbb3339807ee05616b55a117e3cffaa41347bce35c"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.228384 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-scripts\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.228488 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.228543 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-config-data\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.228563 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jftgz\" (UniqueName: \"kubernetes.io/projected/d44aada2-0a99-4783-89f2-55ccde6477d7-kube-api-access-jftgz\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.228597 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.228629 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d44aada2-0a99-4783-89f2-55ccde6477d7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.323749 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.331586 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-scripts\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.331660 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.331724 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-config-data\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.331746 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jftgz\" (UniqueName: \"kubernetes.io/projected/d44aada2-0a99-4783-89f2-55ccde6477d7-kube-api-access-jftgz\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.331770 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.331795 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d44aada2-0a99-4783-89f2-55ccde6477d7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.332432 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d44aada2-0a99-4783-89f2-55ccde6477d7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.336052 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.336924 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-scripts\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.337419 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.337801 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44aada2-0a99-4783-89f2-55ccde6477d7-config-data\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.351188 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jftgz\" (UniqueName: \"kubernetes.io/projected/d44aada2-0a99-4783-89f2-55ccde6477d7-kube-api-access-jftgz\") pod \"cinder-scheduler-0\" (UID: \"d44aada2-0a99-4783-89f2-55ccde6477d7\") " pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.443278 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.676648 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c0c93db6-0100-4770-886a-cf59be2d4f4e","Type":"ContainerStarted","Data":"4573d5e7435afb9aee55621939091e94d53863795c7ec940ff305264b3df0694"}
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.677970 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.697354 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=4.697342376 podStartE2EDuration="4.697342376s" podCreationTimestamp="2026-01-30 21:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:20.696127187 +0000 UTC m=+1254.134763948" watchObservedRunningTime="2026-01-30 21:35:20.697342376 +0000 UTC m=+1254.135979137"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.701679 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" event={"ID":"f7b96580-046d-4a22-b866-78dd55234c0a","Type":"ContainerStarted","Data":"7226b109ee0610df021d7b028718ba70f315cd24195d3bf0b5a1738ea52be53b"}
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.702124 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-spmt5"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.933336 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" podStartSLOduration=4.933319311 podStartE2EDuration="4.933319311s" podCreationTimestamp="2026-01-30 21:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:20.737017836 +0000 UTC m=+1254.175654597" watchObservedRunningTime="2026-01-30 21:35:20.933319311 +0000 UTC m=+1254.371956072"
Jan 30 21:35:20 crc kubenswrapper[4914]: I0130 21:35:20.937634 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 30 21:35:21 crc kubenswrapper[4914]: I0130 21:35:21.723686 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d44aada2-0a99-4783-89f2-55ccde6477d7","Type":"ContainerStarted","Data":"81d26006b11b8296b19b2e767b5fb81d8e2f6b6b7e4b95fe9d8dc3bcf086ca94"}
Jan 30 21:35:21 crc kubenswrapper[4914]: I0130 21:35:21.724213 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api-log" containerID="cri-o://9a2585871d7a8b9c8f9044e4c92aa4188006a57b48001c4651bac140dcfc6834" gracePeriod=30
Jan 30 21:35:21 crc kubenswrapper[4914]: I0130 21:35:21.724386 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api" containerID="cri-o://4573d5e7435afb9aee55621939091e94d53863795c7ec940ff305264b3df0694" gracePeriod=30
Jan 30 21:35:21 crc kubenswrapper[4914]: I0130 21:35:21.838372 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f02d8a4-592b-4b89-aab2-28dbe6d57ec3" path="/var/lib/kubelet/pods/2f02d8a4-592b-4b89-aab2-28dbe6d57ec3/volumes"
Jan 30 21:35:21 crc kubenswrapper[4914]: I0130 21:35:21.839033 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6778a2-f469-4b92-b7d1-a5545b9f9ed0" path="/var/lib/kubelet/pods/4e6778a2-f469-4b92-b7d1-a5545b9f9ed0/volumes"
Jan 30 21:35:22 crc kubenswrapper[4914]: I0130 21:35:22.761567 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d44aada2-0a99-4783-89f2-55ccde6477d7","Type":"ContainerStarted","Data":"a42028c14920f43ede158748fa09b97b5d6fb2b009f638defc5d1efbaee7946c"}
Jan 30 21:35:22 crc kubenswrapper[4914]: I0130 21:35:22.764101 4914 generic.go:334] "Generic (PLEG): container finished" podID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerID="4573d5e7435afb9aee55621939091e94d53863795c7ec940ff305264b3df0694" exitCode=0
Jan 30 21:35:22 crc kubenswrapper[4914]: I0130 21:35:22.764139 4914 generic.go:334] "Generic (PLEG): container finished" podID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerID="9a2585871d7a8b9c8f9044e4c92aa4188006a57b48001c4651bac140dcfc6834" exitCode=143
Jan 30 21:35:22 crc kubenswrapper[4914]: I0130 21:35:22.764163 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c0c93db6-0100-4770-886a-cf59be2d4f4e","Type":"ContainerDied","Data":"4573d5e7435afb9aee55621939091e94d53863795c7ec940ff305264b3df0694"}
Jan 30 21:35:22 crc kubenswrapper[4914]: I0130 21:35:22.764192 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c0c93db6-0100-4770-886a-cf59be2d4f4e","Type":"ContainerDied","Data":"9a2585871d7a8b9c8f9044e4c92aa4188006a57b48001c4651bac140dcfc6834"}
Jan 30 21:35:22 crc kubenswrapper[4914]: I0130 21:35:22.999716 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005400 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-scripts\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005449 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c93db6-0100-4770-886a-cf59be2d4f4e-logs\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005484 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data-custom\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005502 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gflpk\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-kube-api-access-gflpk\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005581 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-certs\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005619 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-combined-ca-bundle\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.005748 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data\") pod \"c0c93db6-0100-4770-886a-cf59be2d4f4e\" (UID: \"c0c93db6-0100-4770-886a-cf59be2d4f4e\") "
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.007085 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0c93db6-0100-4770-886a-cf59be2d4f4e-logs" (OuterVolumeSpecName: "logs") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.012770 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-certs" (OuterVolumeSpecName: "certs") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.012829 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-kube-api-access-gflpk" (OuterVolumeSpecName: "kube-api-access-gflpk") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "kube-api-access-gflpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.018901 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.021918 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-scripts" (OuterVolumeSpecName: "scripts") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.038026 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data" (OuterVolumeSpecName: "config-data") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.054130 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0c93db6-0100-4770-886a-cf59be2d4f4e" (UID: "c0c93db6-0100-4770-886a-cf59be2d4f4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108627 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c93db6-0100-4770-886a-cf59be2d4f4e-logs\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108670 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gflpk\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-kube-api-access-gflpk\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108685 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108696 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c0c93db6-0100-4770-886a-cf59be2d4f4e-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108724 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108919 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.108930 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c93db6-0100-4770-886a-cf59be2d4f4e-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.785328 4914 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d44aada2-0a99-4783-89f2-55ccde6477d7","Type":"ContainerStarted","Data":"85ed2287fec5eb1ae83001600aacf6ce5135871bd19257e0af34e0c6bb366bac"} Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.790099 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3ecfd996-1b82-40f5-a3e2-fc926b8806a9","Type":"ContainerStarted","Data":"a81915c769dd52297ee0e2731ddc296f0bc2a2998ecc78d66d58f40a99b6941b"} Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.794053 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"c0c93db6-0100-4770-886a-cf59be2d4f4e","Type":"ContainerDied","Data":"38c4470eccc77981f24a1d5e66972aa4829f1a15186e8b9ae8104235bc546318"} Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.794100 4914 scope.go:117] "RemoveContainer" containerID="4573d5e7435afb9aee55621939091e94d53863795c7ec940ff305264b3df0694" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.794249 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.811551 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.811533391 podStartE2EDuration="3.811533391s" podCreationTimestamp="2026-01-30 21:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:23.805036106 +0000 UTC m=+1257.243672867" watchObservedRunningTime="2026-01-30 21:35:23.811533391 +0000 UTC m=+1257.250170152" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.846466 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.212546754 podStartE2EDuration="7.846447838s" podCreationTimestamp="2026-01-30 21:35:16 +0000 UTC" firstStartedPulling="2026-01-30 21:35:17.969166839 +0000 UTC m=+1251.407803600" lastFinishedPulling="2026-01-30 21:35:22.603067923 +0000 UTC m=+1256.041704684" observedRunningTime="2026-01-30 21:35:23.83694046 +0000 UTC m=+1257.275577221" watchObservedRunningTime="2026-01-30 21:35:23.846447838 +0000 UTC m=+1257.285084599" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.862089 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.865580 4914 scope.go:117] "RemoveContainer" containerID="9a2585871d7a8b9c8f9044e4c92aa4188006a57b48001c4651bac140dcfc6834" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.888751 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.909450 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.922767 4914 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:23 crc kubenswrapper[4914]: E0130 21:35:23.923413 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api-log" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.923428 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api-log" Jan 30 21:35:23 crc kubenswrapper[4914]: E0130 21:35:23.923462 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.923468 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.923636 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.923647 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" containerName="cloudkitty-api-log" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.924685 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.929039 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.929194 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.930018 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Jan 30 21:35:23 crc kubenswrapper[4914]: I0130 21:35:23.936114 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027022 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027136 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027166 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027206 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027225 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtrpz\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-kube-api-access-gtrpz\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027255 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fea8049a-f388-4d46-a567-473849787e27-logs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027279 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027301 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-scripts\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.027341 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129126 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129177 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129226 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129243 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtrpz\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-kube-api-access-gtrpz\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129273 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fea8049a-f388-4d46-a567-473849787e27-logs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " 
pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129303 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129328 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-scripts\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129363 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.129408 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.132894 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fea8049a-f388-4d46-a567-473849787e27-logs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.133445 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.135181 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.135511 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.135622 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-scripts\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.146925 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.148969 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-certs\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 
21:35:24.149805 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.154859 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtrpz\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-kube-api-access-gtrpz\") pod \"cloudkitty-api-0\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.277334 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:35:24 crc kubenswrapper[4914]: I0130 21:35:24.898505 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:35:24 crc kubenswrapper[4914]: W0130 21:35:24.918143 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfea8049a_f388_4d46_a567_473849787e27.slice/crio-aa8dfaf6cc5ed51e13e3e77713dfaf32dac431020f3e9210ddddb131e34802f2 WatchSource:0}: Error finding container aa8dfaf6cc5ed51e13e3e77713dfaf32dac431020f3e9210ddddb131e34802f2: Status 404 returned error can't find the container with id aa8dfaf6cc5ed51e13e3e77713dfaf32dac431020f3e9210ddddb131e34802f2 Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.174450 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-79c4f49899-bc7gl"] Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.176098 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.179242 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.179443 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.179606 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.222672 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79c4f49899-bc7gl"] Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259316 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-public-tls-certs\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259376 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-combined-ca-bundle\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259397 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-config-data\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: 
I0130 21:35:25.259447 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-etc-swift\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259515 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pthc4\" (UniqueName: \"kubernetes.io/projected/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-kube-api-access-pthc4\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259538 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-log-httpd\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259568 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-internal-tls-certs\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.259587 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-run-httpd\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 
21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.360846 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-public-tls-certs\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.360908 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-combined-ca-bundle\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.360930 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-config-data\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.360973 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-etc-swift\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.361035 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pthc4\" (UniqueName: \"kubernetes.io/projected/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-kube-api-access-pthc4\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 
21:35:25.361055 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-log-httpd\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.361083 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-internal-tls-certs\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.361169 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-run-httpd\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.361598 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-log-httpd\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.368955 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-run-httpd\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.374393 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-public-tls-certs\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.374468 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-internal-tls-certs\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.381555 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-combined-ca-bundle\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.394595 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-config-data\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.395091 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-etc-swift\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.396162 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pthc4\" (UniqueName: 
\"kubernetes.io/projected/0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2-kube-api-access-pthc4\") pod \"swift-proxy-79c4f49899-bc7gl\" (UID: \"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2\") " pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.454857 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.512700 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.876192 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="3ecfd996-1b82-40f5-a3e2-fc926b8806a9" containerName="cloudkitty-proc" containerID="cri-o://a81915c769dd52297ee0e2731ddc296f0bc2a2998ecc78d66d58f40a99b6941b" gracePeriod=30 Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.879225 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c93db6-0100-4770-886a-cf59be2d4f4e" path="/var/lib/kubelet/pods/c0c93db6-0100-4770-886a-cf59be2d4f4e/volumes" Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.880465 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fea8049a-f388-4d46-a567-473849787e27","Type":"ContainerStarted","Data":"362ef2b0b58bcd117734262403641f928628a9463750d7f586b6a62c67828b49"} Jan 30 21:35:25 crc kubenswrapper[4914]: I0130 21:35:25.880495 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fea8049a-f388-4d46-a567-473849787e27","Type":"ContainerStarted","Data":"aa8dfaf6cc5ed51e13e3e77713dfaf32dac431020f3e9210ddddb131e34802f2"} Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 21:35:26.307367 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79c4f49899-bc7gl"] Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 
21:35:26.886494 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79c4f49899-bc7gl" event={"ID":"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2","Type":"ContainerStarted","Data":"23d34e07e6c23bbeeb5aaa014e4389021081eec1f770c1d67a4b1764a58fdece"} Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 21:35:26.983003 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 21:35:26.983060 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 21:35:26.983109 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 21:35:26.983927 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0fa301f4a7d6f2d2094968ff039d7aedbb13e612ee90301cf0076f1904de139"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:35:26 crc kubenswrapper[4914]: I0130 21:35:26.983990 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" 
containerID="cri-o://f0fa301f4a7d6f2d2094968ff039d7aedbb13e612ee90301cf0076f1904de139" gracePeriod=600 Jan 30 21:35:27 crc kubenswrapper[4914]: I0130 21:35:27.541696 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:35:27 crc kubenswrapper[4914]: I0130 21:35:27.693804 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gpjkz"] Jan 30 21:35:27 crc kubenswrapper[4914]: I0130 21:35:27.694192 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerName="dnsmasq-dns" containerID="cri-o://6328dc5f4ff94d585fd3046f699039c93a02416277657d4dca3d9cc90eba6f45" gracePeriod=10 Jan 30 21:35:27 crc kubenswrapper[4914]: I0130 21:35:27.962891 4914 generic.go:334] "Generic (PLEG): container finished" podID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerID="6328dc5f4ff94d585fd3046f699039c93a02416277657d4dca3d9cc90eba6f45" exitCode=0 Jan 30 21:35:27 crc kubenswrapper[4914]: I0130 21:35:27.966967 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=4.966951089 podStartE2EDuration="4.966951089s" podCreationTimestamp="2026-01-30 21:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:27.964254924 +0000 UTC m=+1261.402891685" watchObservedRunningTime="2026-01-30 21:35:27.966951089 +0000 UTC m=+1261.405587850" Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.008280 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="f0fa301f4a7d6f2d2094968ff039d7aedbb13e612ee90301cf0076f1904de139" exitCode=0 Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.517911 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-api-0" event={"ID":"fea8049a-f388-4d46-a567-473849787e27","Type":"ContainerStarted","Data":"921990d8fa283366833ced4a3720567fbb50029acb68e25fcf13af86f5001b0f"} Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.518209 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.518220 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" event={"ID":"3b59a0a4-c433-4c7f-a789-b84c19ce532c","Type":"ContainerDied","Data":"6328dc5f4ff94d585fd3046f699039c93a02416277657d4dca3d9cc90eba6f45"} Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.518233 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79c4f49899-bc7gl" event={"ID":"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2","Type":"ContainerStarted","Data":"b513002b0d29724b060ee26d88a259ddc600ca2ff78bdecf75d052a9f3bba969"} Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.518247 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"f0fa301f4a7d6f2d2094968ff039d7aedbb13e612ee90301cf0076f1904de139"} Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.518269 4914 scope.go:117] "RemoveContainer" containerID="af122f4ba69a9f285a7275f9a58f9bcc4666b137ea591150601d02ec4dc641e5" Jan 30 21:35:28 crc kubenswrapper[4914]: I0130 21:35:28.944849 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.031154 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" event={"ID":"3b59a0a4-c433-4c7f-a789-b84c19ce532c","Type":"ContainerDied","Data":"7a1b6a576e77249deff47171ff3f0155744d68013a0ee7021611106af4fdc728"} Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.031506 4914 scope.go:117] "RemoveContainer" containerID="6328dc5f4ff94d585fd3046f699039c93a02416277657d4dca3d9cc90eba6f45" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.031613 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gpjkz" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.045805 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-swift-storage-0\") pod \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.045868 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-config\") pod \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.045963 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm2rb\" (UniqueName: \"kubernetes.io/projected/3b59a0a4-c433-4c7f-a789-b84c19ce532c-kube-api-access-wm2rb\") pod \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.046004 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-nb\") pod \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.046095 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-sb\") pod \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.046137 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-svc\") pod \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\" (UID: \"3b59a0a4-c433-4c7f-a789-b84c19ce532c\") " Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.051562 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79c4f49899-bc7gl" event={"ID":"0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2","Type":"ContainerStarted","Data":"d2ed08aca18bfb2084878e5bf971d0e246c39c33ae7259d29488aa9ba4bf37f9"} Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.075099 4914 scope.go:117] "RemoveContainer" containerID="585de9fe657511be774b9cac30c695df39751c4385d1c61e48dd4303aa9c27ce" Jan 30 21:35:29 crc kubenswrapper[4914]: E0130 21:35:29.102681 4914 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b59a0a4_c433_4c7f_a789_b84c19ce532c.slice/crio-conmon-6328dc5f4ff94d585fd3046f699039c93a02416277657d4dca3d9cc90eba6f45.scope\": RecentStats: unable to find data in memory cache]" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.143554 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-config" 
(OuterVolumeSpecName: "config") pod "3b59a0a4-c433-4c7f-a789-b84c19ce532c" (UID: "3b59a0a4-c433-4c7f-a789-b84c19ce532c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.149231 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.157517 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b59a0a4-c433-4c7f-a789-b84c19ce532c" (UID: "3b59a0a4-c433-4c7f-a789-b84c19ce532c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.170207 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b59a0a4-c433-4c7f-a789-b84c19ce532c" (UID: "3b59a0a4-c433-4c7f-a789-b84c19ce532c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.188435 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b59a0a4-c433-4c7f-a789-b84c19ce532c-kube-api-access-wm2rb" (OuterVolumeSpecName: "kube-api-access-wm2rb") pod "3b59a0a4-c433-4c7f-a789-b84c19ce532c" (UID: "3b59a0a4-c433-4c7f-a789-b84c19ce532c"). InnerVolumeSpecName "kube-api-access-wm2rb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.188666 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b59a0a4-c433-4c7f-a789-b84c19ce532c" (UID: "3b59a0a4-c433-4c7f-a789-b84c19ce532c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.200268 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b59a0a4-c433-4c7f-a789-b84c19ce532c" (UID: "3b59a0a4-c433-4c7f-a789-b84c19ce532c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.251090 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm2rb\" (UniqueName: \"kubernetes.io/projected/3b59a0a4-c433-4c7f-a789-b84c19ce532c-kube-api-access-wm2rb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.251124 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.251133 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.251142 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-svc\") on node \"crc\" DevicePath 
\"\"" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.251150 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b59a0a4-c433-4c7f-a789-b84c19ce532c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.559356 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gpjkz"] Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.568482 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gpjkz"] Jan 30 21:35:29 crc kubenswrapper[4914]: I0130 21:35:29.835035 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" path="/var/lib/kubelet/pods/3b59a0a4-c433-4c7f-a789-b84c19ce532c/volumes" Jan 30 21:35:30 crc kubenswrapper[4914]: I0130 21:35:30.078111 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"018dff8f009112f2d13f034fc24ae6b87f418ea17a0bfaeb82d8fef0d185a5d1"} Jan 30 21:35:30 crc kubenswrapper[4914]: I0130 21:35:30.078193 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:30 crc kubenswrapper[4914]: I0130 21:35:30.078234 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:30 crc kubenswrapper[4914]: I0130 21:35:30.110304 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-79c4f49899-bc7gl" podStartSLOduration=5.11028074 podStartE2EDuration="5.11028074s" podCreationTimestamp="2026-01-30 21:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
21:35:30.109439749 +0000 UTC m=+1263.548076550" watchObservedRunningTime="2026-01-30 21:35:30.11028074 +0000 UTC m=+1263.548917531" Jan 30 21:35:30 crc kubenswrapper[4914]: I0130 21:35:30.684841 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 21:35:31 crc kubenswrapper[4914]: I0130 21:35:31.219009 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 21:35:32 crc kubenswrapper[4914]: I0130 21:35:32.108177 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-79c4f49899-bc7gl" podUID="0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:35:33 crc kubenswrapper[4914]: I0130 21:35:33.108582 4914 generic.go:334] "Generic (PLEG): container finished" podID="3ecfd996-1b82-40f5-a3e2-fc926b8806a9" containerID="a81915c769dd52297ee0e2731ddc296f0bc2a2998ecc78d66d58f40a99b6941b" exitCode=0 Jan 30 21:35:33 crc kubenswrapper[4914]: I0130 21:35:33.108641 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3ecfd996-1b82-40f5-a3e2-fc926b8806a9","Type":"ContainerDied","Data":"a81915c769dd52297ee0e2731ddc296f0bc2a2998ecc78d66d58f40a99b6941b"} Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.048073 4914 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb7fe1c6e-0858-479f-b365-081a1b8fcf2d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb7fe1c6e-0858-479f-b365-081a1b8fcf2d] : Timed out while waiting for systemd to remove kubepods-besteffort-podb7fe1c6e_0858_479f_b365_081a1b8fcf2d.slice" Jan 30 21:35:34 crc kubenswrapper[4914]: E0130 21:35:34.048134 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podb7fe1c6e-0858-479f-b365-081a1b8fcf2d] : 
unable to destroy cgroup paths for cgroup [kubepods besteffort podb7fe1c6e-0858-479f-b365-081a1b8fcf2d] : Timed out while waiting for systemd to remove kubepods-besteffort-podb7fe1c6e_0858_479f_b365_081a1b8fcf2d.slice" pod="openstack/cinder-db-sync-6kskl" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.117429 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6kskl" Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.336877 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.627055 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.647378 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-central-agent" containerID="cri-o://f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5" gracePeriod=30 Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.647675 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-notification-agent" containerID="cri-o://237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8" gracePeriod=30 Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.647856 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="sg-core" containerID="cri-o://6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04" gracePeriod=30 Jan 30 21:35:34 crc kubenswrapper[4914]: I0130 21:35:34.647652 4914 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="proxy-httpd" containerID="cri-o://e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d" gracePeriod=30 Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.134748 4914 generic.go:334] "Generic (PLEG): container finished" podID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerID="e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d" exitCode=0 Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.134779 4914 generic.go:334] "Generic (PLEG): container finished" podID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerID="6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04" exitCode=2 Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.134788 4914 generic.go:334] "Generic (PLEG): container finished" podID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerID="f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5" exitCode=0 Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.134800 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerDied","Data":"e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d"} Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.134863 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerDied","Data":"6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04"} Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.134876 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerDied","Data":"f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5"} Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.518811 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:35 crc kubenswrapper[4914]: I0130 21:35:35.520034 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79c4f49899-bc7gl" Jan 30 21:35:36 crc kubenswrapper[4914]: I0130 21:35:36.533652 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:36 crc kubenswrapper[4914]: I0130 21:35:36.534214 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="134b35c4-3656-4890-8cb2-76bc09779403" containerName="kube-state-metrics" containerID="cri-o://47c9c3cf033214c46eb2573058e39b836cf81ac9dfe8ebc208f5186a76e60b4c" gracePeriod=30 Jan 30 21:35:37 crc kubenswrapper[4914]: I0130 21:35:37.158083 4914 generic.go:334] "Generic (PLEG): container finished" podID="134b35c4-3656-4890-8cb2-76bc09779403" containerID="47c9c3cf033214c46eb2573058e39b836cf81ac9dfe8ebc208f5186a76e60b4c" exitCode=2 Jan 30 21:35:37 crc kubenswrapper[4914]: I0130 21:35:37.158223 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"134b35c4-3656-4890-8cb2-76bc09779403","Type":"ContainerDied","Data":"47c9c3cf033214c46eb2573058e39b836cf81ac9dfe8ebc208f5186a76e60b4c"} Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.165277 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.197617 4914 generic.go:334] "Generic (PLEG): container finished" podID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerID="947c4b37c2af5121b8ffad8108987b966bc66f3d7f4b14041a26c5e61078604c" exitCode=137 Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.197812 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1ec90be3-5dfd-48aa-934c-70ef856a51c5","Type":"ContainerDied","Data":"947c4b37c2af5121b8ffad8108987b966bc66f3d7f4b14041a26c5e61078604c"} Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.201793 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"3ecfd996-1b82-40f5-a3e2-fc926b8806a9","Type":"ContainerDied","Data":"f4553125a8dec3107c63be7e9b7ff3c6201b12408096b1647879955b1bb1f363"} Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.201850 4914 scope.go:117] "RemoveContainer" containerID="a81915c769dd52297ee0e2731ddc296f0bc2a2998ecc78d66d58f40a99b6941b" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.202149 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.261605 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data-custom\") pod \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.261748 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-scripts\") pod \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.261888 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-certs\") pod \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.261921 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-combined-ca-bundle\") pod \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.261997 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data\") pod \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.262026 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psf8r\" (UniqueName: 
\"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-kube-api-access-psf8r\") pod \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\" (UID: \"3ecfd996-1b82-40f5-a3e2-fc926b8806a9\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.271361 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-certs" (OuterVolumeSpecName: "certs") pod "3ecfd996-1b82-40f5-a3e2-fc926b8806a9" (UID: "3ecfd996-1b82-40f5-a3e2-fc926b8806a9"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.271453 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-kube-api-access-psf8r" (OuterVolumeSpecName: "kube-api-access-psf8r") pod "3ecfd996-1b82-40f5-a3e2-fc926b8806a9" (UID: "3ecfd996-1b82-40f5-a3e2-fc926b8806a9"). InnerVolumeSpecName "kube-api-access-psf8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.273478 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ecfd996-1b82-40f5-a3e2-fc926b8806a9" (UID: "3ecfd996-1b82-40f5-a3e2-fc926b8806a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.273507 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-scripts" (OuterVolumeSpecName: "scripts") pod "3ecfd996-1b82-40f5-a3e2-fc926b8806a9" (UID: "3ecfd996-1b82-40f5-a3e2-fc926b8806a9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.314611 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data" (OuterVolumeSpecName: "config-data") pod "3ecfd996-1b82-40f5-a3e2-fc926b8806a9" (UID: "3ecfd996-1b82-40f5-a3e2-fc926b8806a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.323620 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ecfd996-1b82-40f5-a3e2-fc926b8806a9" (UID: "3ecfd996-1b82-40f5-a3e2-fc926b8806a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.363955 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.363982 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.364001 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.364012 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psf8r\" (UniqueName: \"kubernetes.io/projected/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-kube-api-access-psf8r\") on node \"crc\" DevicePath \"\"" Jan 30 
21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.364024 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.364034 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ecfd996-1b82-40f5-a3e2-fc926b8806a9-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.505085 4914 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f3f352_1ffb_48b4_b985_0d2d2206c7c1.slice/crio-237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8.scope\": RecentStats: unable to find data in memory cache]" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.675679 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.701257 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.709082 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.732867 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.750645 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.751162 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerName="dnsmasq-dns" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751179 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerName="dnsmasq-dns" Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.751200 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134b35c4-3656-4890-8cb2-76bc09779403" containerName="kube-state-metrics" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751209 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="134b35c4-3656-4890-8cb2-76bc09779403" containerName="kube-state-metrics" Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.751222 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecfd996-1b82-40f5-a3e2-fc926b8806a9" containerName="cloudkitty-proc" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751230 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecfd996-1b82-40f5-a3e2-fc926b8806a9" containerName="cloudkitty-proc" Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.751253 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerName="init" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751260 4914 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerName="init" Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.751281 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api-log" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751288 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api-log" Jan 30 21:35:39 crc kubenswrapper[4914]: E0130 21:35:39.751301 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751308 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751540 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b59a0a4-c433-4c7f-a789-b84c19ce532c" containerName="dnsmasq-dns" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751558 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ecfd996-1b82-40f5-a3e2-fc926b8806a9" containerName="cloudkitty-proc" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751577 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api-log" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751586 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="134b35c4-3656-4890-8cb2-76bc09779403" containerName="kube-state-metrics" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.751599 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.752627 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.757418 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.763412 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.777243 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knszm\" (UniqueName: \"kubernetes.io/projected/134b35c4-3656-4890-8cb2-76bc09779403-kube-api-access-knszm\") pod \"134b35c4-3656-4890-8cb2-76bc09779403\" (UID: \"134b35c4-3656-4890-8cb2-76bc09779403\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.784843 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134b35c4-3656-4890-8cb2-76bc09779403-kube-api-access-knszm" (OuterVolumeSpecName: "kube-api-access-knszm") pod "134b35c4-3656-4890-8cb2-76bc09779403" (UID: "134b35c4-3656-4890-8cb2-76bc09779403"). InnerVolumeSpecName "kube-api-access-knszm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.840391 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ecfd996-1b82-40f5-a3e2-fc926b8806a9" path="/var/lib/kubelet/pods/3ecfd996-1b82-40f5-a3e2-fc926b8806a9/volumes" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.869426 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.880683 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec90be3-5dfd-48aa-934c-70ef856a51c5-logs\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881219 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data-custom\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881269 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkz2n\" (UniqueName: \"kubernetes.io/projected/1ec90be3-5dfd-48aa-934c-70ef856a51c5-kube-api-access-nkz2n\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881290 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ec90be3-5dfd-48aa-934c-70ef856a51c5-logs" (OuterVolumeSpecName: "logs") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881307 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-combined-ca-bundle\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881456 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881495 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-scripts\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.881609 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ec90be3-5dfd-48aa-934c-70ef856a51c5-etc-machine-id\") pod \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\" (UID: \"1ec90be3-5dfd-48aa-934c-70ef856a51c5\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882252 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882399 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-v7tsk\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-kube-api-access-v7tsk\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882382 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec90be3-5dfd-48aa-934c-70ef856a51c5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882605 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-certs\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882679 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882890 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.882930 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-scripts\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.883082 4914 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ec90be3-5dfd-48aa-934c-70ef856a51c5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.883099 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knszm\" (UniqueName: \"kubernetes.io/projected/134b35c4-3656-4890-8cb2-76bc09779403-kube-api-access-knszm\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.883111 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ec90be3-5dfd-48aa-934c-70ef856a51c5-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.885616 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-scripts" (OuterVolumeSpecName: "scripts") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.894832 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.901146 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec90be3-5dfd-48aa-934c-70ef856a51c5-kube-api-access-nkz2n" (OuterVolumeSpecName: "kube-api-access-nkz2n") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "kube-api-access-nkz2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.932563 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.951323 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data" (OuterVolumeSpecName: "config-data") pod "1ec90be3-5dfd-48aa-934c-70ef856a51c5" (UID: "1ec90be3-5dfd-48aa-934c-70ef856a51c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984054 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-scripts\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984117 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6klk\" (UniqueName: \"kubernetes.io/projected/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-kube-api-access-z6klk\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984138 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-log-httpd\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984200 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-sg-core-conf-yaml\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984276 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-config-data\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984348 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-run-httpd\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984726 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984795 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-combined-ca-bundle\") pod \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\" (UID: \"63f3f352-1ffb-48b4-b985-0d2d2206c7c1\") " Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.984853 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985219 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985244 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-scripts\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985322 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985377 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tsk\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-kube-api-access-v7tsk\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985417 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-certs\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985442 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985514 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985524 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkz2n\" (UniqueName: \"kubernetes.io/projected/1ec90be3-5dfd-48aa-934c-70ef856a51c5-kube-api-access-nkz2n\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985534 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985542 4914 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985550 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985559 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ec90be3-5dfd-48aa-934c-70ef856a51c5-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.985567 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.989409 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-scripts" (OuterVolumeSpecName: "scripts") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.989458 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-kube-api-access-z6klk" (OuterVolumeSpecName: "kube-api-access-z6klk") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "kube-api-access-z6klk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.989663 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.990243 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-certs\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.993231 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:39 crc kubenswrapper[4914]: I0130 21:35:39.993519 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.002764 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-scripts\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.005423 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tsk\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-kube-api-access-v7tsk\") pod \"cloudkitty-proc-0\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.017342 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.079976 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.085501 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.086945 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.086975 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.086984 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6klk\" (UniqueName: \"kubernetes.io/projected/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-kube-api-access-z6klk\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.086995 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.095147 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-config-data" (OuterVolumeSpecName: "config-data") pod "63f3f352-1ffb-48b4-b985-0d2d2206c7c1" (UID: "63f3f352-1ffb-48b4-b985-0d2d2206c7c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.188287 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f3f352-1ffb-48b4-b985-0d2d2206c7c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.218115 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-594584649-k6kdl" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.232150 4914 generic.go:334] "Generic (PLEG): container finished" podID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerID="237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8" exitCode=0 Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.232220 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.232230 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerDied","Data":"237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8"} Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.232258 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f3f352-1ffb-48b4-b985-0d2d2206c7c1","Type":"ContainerDied","Data":"11c941e0f9b3fbe702d97ee6634204afef0ea525191b8182593a646de41fed21"} Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.232275 4914 scope.go:117] "RemoveContainer" containerID="e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.237512 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1ec90be3-5dfd-48aa-934c-70ef856a51c5","Type":"ContainerDied","Data":"b7ac8414b0072fad275b965fb0dfb37793cf06c9b24a703a9efa63fd2f4ac6fe"} Jan 30 
21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.237591 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.245038 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"134b35c4-3656-4890-8cb2-76bc09779403","Type":"ContainerDied","Data":"2fbaaf1a1f3846be96c93c078f9f73d1c804ee7b19f16f8275ff2b251a9d20df"} Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.245123 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.259775 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed","Type":"ContainerStarted","Data":"c92b84e19c4a5b9e65e0c79674173eb4fa8dbc2a52808b4fc6f636811e5de124"} Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.298992 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5789d46bdd-5kscc"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.299215 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5789d46bdd-5kscc" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-api" containerID="cri-o://0195835ed917d263a825d5070d67bbb21d31e2b7db9444936980af6a4d7a30e6" gracePeriod=30 Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.308510 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5789d46bdd-5kscc" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-httpd" containerID="cri-o://cd9fa6fd0ba56738d050e4726d6578839bcda9c1b3fee5ae13e7dd973ed8100a" gracePeriod=30 Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.342901 4914 scope.go:117] "RemoveContainer" 
containerID="6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.343984 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.393185 4914 scope.go:117] "RemoveContainer" containerID="237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.430285 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.436049 4914 scope.go:117] "RemoveContainer" containerID="f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.439201 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.450948 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.454816 4914 scope.go:117] "RemoveContainer" containerID="e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.455229 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d\": container with ID starting with e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d not found: ID does not exist" containerID="e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.455254 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d"} err="failed to get container status 
\"e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d\": rpc error: code = NotFound desc = could not find container \"e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d\": container with ID starting with e003dae359bf8c1ce30cd060252e6599786b150459d0ae7eb07985a3459eb48d not found: ID does not exist" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.455432 4914 scope.go:117] "RemoveContainer" containerID="6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.455765 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04\": container with ID starting with 6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04 not found: ID does not exist" containerID="6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.455788 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04"} err="failed to get container status \"6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04\": rpc error: code = NotFound desc = could not find container \"6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04\": container with ID starting with 6477d6d8d2aa0dff14d1edc8986a5f8e4b34cfe14d6f442480c3a05b21d7ff04 not found: ID does not exist" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.455821 4914 scope.go:117] "RemoveContainer" containerID="237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.456152 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8\": container with ID starting with 237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8 not found: ID does not exist" containerID="237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.456168 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8"} err="failed to get container status \"237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8\": rpc error: code = NotFound desc = could not find container \"237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8\": container with ID starting with 237ebdc66e6ff43178fee12f856d64b48e911d8986a371896cc650ecdc3809c8 not found: ID does not exist" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.456204 4914 scope.go:117] "RemoveContainer" containerID="f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.456449 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5\": container with ID starting with f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5 not found: ID does not exist" containerID="f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.456483 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5"} err="failed to get container status \"f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5\": rpc error: code = NotFound desc = could not find container \"f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5\": container with ID 
starting with f6e9b1492fb6eec2a0336ad396bfbe2c8616fcfd57fbfb1adb2f3b85f1e2e7a5 not found: ID does not exist" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.456508 4914 scope.go:117] "RemoveContainer" containerID="947c4b37c2af5121b8ffad8108987b966bc66f3d7f4b14041a26c5e61078604c" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.463409 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.466207 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="proxy-httpd" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466239 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="proxy-httpd" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.466259 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="sg-core" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466266 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="sg-core" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.466287 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-notification-agent" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466293 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-notification-agent" Jan 30 21:35:40 crc kubenswrapper[4914]: E0130 21:35:40.466315 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-central-agent" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466322 4914 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-central-agent" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466511 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-notification-agent" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466527 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="ceilometer-central-agent" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466535 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="sg-core" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.466553 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" containerName="proxy-httpd" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.467491 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.471802 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.472342 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.472828 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hc9kb" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.487273 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.491854 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.493314 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.495090 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.495254 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.508067 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.527797 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.539166 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.549399 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.557983 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.560275 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.561951 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.562097 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.565809 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.573047 4914 scope.go:117] "RemoveContainer" containerID="0a050a81069911fee324bcfa300600734ba13e068a4cfdd947f441e91740437d" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.585274 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.605131 4914 scope.go:117] "RemoveContainer" containerID="47c9c3cf033214c46eb2573058e39b836cf81ac9dfe8ebc208f5186a76e60b4c" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607113 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-scripts\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607148 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z48fw\" (UniqueName: \"kubernetes.io/projected/f9407523-2a66-49a6-98d7-a8e53961e788-kube-api-access-z48fw\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607177 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-config-data\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607211 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9tb\" (UniqueName: \"kubernetes.io/projected/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-api-access-gr9tb\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607239 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607315 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607337 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-log-httpd\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607391 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-run-httpd\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607410 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607425 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607442 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.607458 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.687759 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709565 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-config-data\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709609 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z48fw\" (UniqueName: \"kubernetes.io/projected/f9407523-2a66-49a6-98d7-a8e53961e788-kube-api-access-z48fw\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709629 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709646 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-scripts\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709677 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-config-data\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709735 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9tb\" (UniqueName: 
\"kubernetes.io/projected/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-api-access-gr9tb\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709758 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709784 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709806 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709837 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-config-data-custom\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709896 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284840e-6355-4145-9853-723a3d280963-logs\") pod 
\"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709924 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709945 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-log-httpd\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.709978 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqgc\" (UniqueName: \"kubernetes.io/projected/c284840e-6355-4145-9853-723a3d280963-kube-api-access-mjqgc\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710001 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-scripts\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710022 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-run-httpd\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710038 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710056 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710075 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710092 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710118 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c284840e-6355-4145-9853-723a3d280963-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.710844 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-log-httpd\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.711141 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-run-httpd\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.714382 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-config-data\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.714799 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.714828 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.716969 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-scripts\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.717342 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.718411 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.718977 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.722067 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.730000 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z48fw\" (UniqueName: \"kubernetes.io/projected/f9407523-2a66-49a6-98d7-a8e53961e788-kube-api-access-z48fw\") pod \"ceilometer-0\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.733531 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9tb\" (UniqueName: 
\"kubernetes.io/projected/9d788ea3-1370-4a64-aff1-d8e2af7c6f94-kube-api-access-gr9tb\") pod \"kube-state-metrics-0\" (UID: \"9d788ea3-1370-4a64-aff1-d8e2af7c6f94\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812190 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812242 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812270 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-config-data-custom\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812329 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284840e-6355-4145-9853-723a3d280963-logs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812390 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqgc\" (UniqueName: \"kubernetes.io/projected/c284840e-6355-4145-9853-723a3d280963-kube-api-access-mjqgc\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" 
Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812413 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-scripts\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812449 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c284840e-6355-4145-9853-723a3d280963-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812471 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-config-data\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812487 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.812678 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c284840e-6355-4145-9853-723a3d280963-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.813126 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284840e-6355-4145-9853-723a3d280963-logs\") pod 
\"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.817046 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-config-data\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.817536 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-scripts\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.817786 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.818446 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.820673 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.821338 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c284840e-6355-4145-9853-723a3d280963-config-data-custom\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.828684 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqgc\" (UniqueName: \"kubernetes.io/projected/c284840e-6355-4145-9853-723a3d280963-kube-api-access-mjqgc\") pod \"cinder-api-0\" (UID: \"c284840e-6355-4145-9853-723a3d280963\") " pod="openstack/cinder-api-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.898132 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.958446 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:40 crc kubenswrapper[4914]: I0130 21:35:40.968259 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.289032 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"d069f103-1546-4a76-963e-2d160d5a347d","Type":"ContainerStarted","Data":"fdb7b1f8d5119dde9dc159739be80d70038da7d53ba436caed816511baa1b067"} Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.289367 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"d069f103-1546-4a76-963e-2d160d5a347d","Type":"ContainerStarted","Data":"aeb5d5d60a33865e48ce033f064f2ec0bbf42395dcefb15cbbd03c38ccf30e66"} Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.300063 4914 generic.go:334] "Generic (PLEG): container finished" podID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerID="cd9fa6fd0ba56738d050e4726d6578839bcda9c1b3fee5ae13e7dd973ed8100a" exitCode=0 Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.301045 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5789d46bdd-5kscc" event={"ID":"af8dbc06-6b83-49c0-9413-56a90165fb97","Type":"ContainerDied","Data":"cd9fa6fd0ba56738d050e4726d6578839bcda9c1b3fee5ae13e7dd973ed8100a"} Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.325177 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.325155165 podStartE2EDuration="2.325155165s" podCreationTimestamp="2026-01-30 21:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:41.314313085 +0000 UTC m=+1274.752949846" watchObservedRunningTime="2026-01-30 21:35:41.325155165 +0000 UTC m=+1274.763791926" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.347500 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.414620491 
podStartE2EDuration="24.34748157s" podCreationTimestamp="2026-01-30 21:35:17 +0000 UTC" firstStartedPulling="2026-01-30 21:35:18.543565524 +0000 UTC m=+1251.982202285" lastFinishedPulling="2026-01-30 21:35:39.476426603 +0000 UTC m=+1272.915063364" observedRunningTime="2026-01-30 21:35:41.343931975 +0000 UTC m=+1274.782568736" watchObservedRunningTime="2026-01-30 21:35:41.34748157 +0000 UTC m=+1274.786118341" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.387755 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ch6mc"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.389359 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.406259 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ch6mc"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.418052 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:41 crc kubenswrapper[4914]: W0130 21:35:41.429755 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d788ea3_1370_4a64_aff1_d8e2af7c6f94.slice/crio-57ed499bc0fb2dfb98afc917dda7a834fca6686e4e8094b75c1712bbd2e703a4 WatchSource:0}: Error finding container 57ed499bc0fb2dfb98afc917dda7a834fca6686e4e8094b75c1712bbd2e703a4: Status 404 returned error can't find the container with id 57ed499bc0fb2dfb98afc917dda7a834fca6686e4e8094b75c1712bbd2e703a4 Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.487006 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f69w2"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.488618 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.499528 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f69w2"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.554974 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfcm\" (UniqueName: \"kubernetes.io/projected/40e7c831-3c33-429a-ac8a-e7768226c344-kube-api-access-wcfcm\") pod \"nova-api-db-create-ch6mc\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.555083 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40e7c831-3c33-429a-ac8a-e7768226c344-operator-scripts\") pod \"nova-api-db-create-ch6mc\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.591377 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dpz67"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.592646 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.600834 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dpz67"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.607118 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a41a-account-create-update-55jbl"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.608308 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.610534 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.629872 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a41a-account-create-update-55jbl"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.656872 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40e7c831-3c33-429a-ac8a-e7768226c344-operator-scripts\") pod \"nova-api-db-create-ch6mc\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.656959 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-operator-scripts\") pod \"nova-cell0-db-create-f69w2\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.657038 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls425\" (UniqueName: \"kubernetes.io/projected/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-kube-api-access-ls425\") pod \"nova-cell0-db-create-f69w2\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.657114 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfcm\" (UniqueName: \"kubernetes.io/projected/40e7c831-3c33-429a-ac8a-e7768226c344-kube-api-access-wcfcm\") pod \"nova-api-db-create-ch6mc\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " 
pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.674488 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40e7c831-3c33-429a-ac8a-e7768226c344-operator-scripts\") pod \"nova-api-db-create-ch6mc\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: W0130 21:35:41.696015 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9407523_2a66_49a6_98d7_a8e53961e788.slice/crio-17e32a51aaa9c4b1fdbb91a8061d4a9bdd32912ea282a567089812a6881cb08b WatchSource:0}: Error finding container 17e32a51aaa9c4b1fdbb91a8061d4a9bdd32912ea282a567089812a6881cb08b: Status 404 returned error can't find the container with id 17e32a51aaa9c4b1fdbb91a8061d4a9bdd32912ea282a567089812a6881cb08b Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.697640 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfcm\" (UniqueName: \"kubernetes.io/projected/40e7c831-3c33-429a-ac8a-e7768226c344-kube-api-access-wcfcm\") pod \"nova-api-db-create-ch6mc\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.722381 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.735342 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.760003 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-operator-scripts\") pod \"nova-cell0-db-create-f69w2\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.760072 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tb9\" (UniqueName: \"kubernetes.io/projected/d2dbf656-0687-4199-8101-c25fb82801e8-kube-api-access-m9tb9\") pod \"nova-api-a41a-account-create-update-55jbl\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.760108 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db5c9d5a-21ac-4c8c-a108-c6752014ec58-operator-scripts\") pod \"nova-cell1-db-create-dpz67\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.760142 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls425\" (UniqueName: \"kubernetes.io/projected/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-kube-api-access-ls425\") pod \"nova-cell0-db-create-f69w2\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.760223 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2dbf656-0687-4199-8101-c25fb82801e8-operator-scripts\") pod \"nova-api-a41a-account-create-update-55jbl\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.760239 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8l5g\" (UniqueName: \"kubernetes.io/projected/db5c9d5a-21ac-4c8c-a108-c6752014ec58-kube-api-access-w8l5g\") pod \"nova-cell1-db-create-dpz67\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.761173 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-operator-scripts\") pod \"nova-cell0-db-create-f69w2\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.774400 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.779173 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls425\" (UniqueName: \"kubernetes.io/projected/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-kube-api-access-ls425\") pod \"nova-cell0-db-create-f69w2\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.799478 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2c59-account-create-update-ndrfd"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.801465 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.803592 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.821850 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2c59-account-create-update-ndrfd"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.831836 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.861777 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tb9\" (UniqueName: \"kubernetes.io/projected/d2dbf656-0687-4199-8101-c25fb82801e8-kube-api-access-m9tb9\") pod \"nova-api-a41a-account-create-update-55jbl\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.861817 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db5c9d5a-21ac-4c8c-a108-c6752014ec58-operator-scripts\") pod \"nova-cell1-db-create-dpz67\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.861911 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2dbf656-0687-4199-8101-c25fb82801e8-operator-scripts\") pod \"nova-api-a41a-account-create-update-55jbl\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.861929 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w8l5g\" (UniqueName: \"kubernetes.io/projected/db5c9d5a-21ac-4c8c-a108-c6752014ec58-kube-api-access-w8l5g\") pod \"nova-cell1-db-create-dpz67\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.862894 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db5c9d5a-21ac-4c8c-a108-c6752014ec58-operator-scripts\") pod \"nova-cell1-db-create-dpz67\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.863426 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2dbf656-0687-4199-8101-c25fb82801e8-operator-scripts\") pod \"nova-api-a41a-account-create-update-55jbl\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.866185 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134b35c4-3656-4890-8cb2-76bc09779403" path="/var/lib/kubelet/pods/134b35c4-3656-4890-8cb2-76bc09779403/volumes" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.867061 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" path="/var/lib/kubelet/pods/1ec90be3-5dfd-48aa-934c-70ef856a51c5/volumes" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.867696 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f3f352-1ffb-48b4-b985-0d2d2206c7c1" path="/var/lib/kubelet/pods/63f3f352-1ffb-48b4-b985-0d2d2206c7c1/volumes" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.877271 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8l5g\" (UniqueName: 
\"kubernetes.io/projected/db5c9d5a-21ac-4c8c-a108-c6752014ec58-kube-api-access-w8l5g\") pod \"nova-cell1-db-create-dpz67\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.877916 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9tb9\" (UniqueName: \"kubernetes.io/projected/d2dbf656-0687-4199-8101-c25fb82801e8-kube-api-access-m9tb9\") pod \"nova-api-a41a-account-create-update-55jbl\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.963328 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a7e1e76-75e3-4193-b755-1b044debf71f-operator-scripts\") pod \"nova-cell0-2c59-account-create-update-ndrfd\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.963633 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znt95\" (UniqueName: \"kubernetes.io/projected/3a7e1e76-75e3-4193-b755-1b044debf71f-kube-api-access-znt95\") pod \"nova-cell0-2c59-account-create-update-ndrfd\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.968383 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9ea2-account-create-update-7nt6g"] Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.969980 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:41 crc kubenswrapper[4914]: I0130 21:35:41.984888 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.001236 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9ea2-account-create-update-7nt6g"] Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.065530 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flt94\" (UniqueName: \"kubernetes.io/projected/5bed7ff8-1ca4-47d8-bb12-a13840612182-kube-api-access-flt94\") pod \"nova-cell1-9ea2-account-create-update-7nt6g\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.065785 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bed7ff8-1ca4-47d8-bb12-a13840612182-operator-scripts\") pod \"nova-cell1-9ea2-account-create-update-7nt6g\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.065846 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a7e1e76-75e3-4193-b755-1b044debf71f-operator-scripts\") pod \"nova-cell0-2c59-account-create-update-ndrfd\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.065908 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znt95\" (UniqueName: 
\"kubernetes.io/projected/3a7e1e76-75e3-4193-b755-1b044debf71f-kube-api-access-znt95\") pod \"nova-cell0-2c59-account-create-update-ndrfd\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.066574 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a7e1e76-75e3-4193-b755-1b044debf71f-operator-scripts\") pod \"nova-cell0-2c59-account-create-update-ndrfd\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.089121 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znt95\" (UniqueName: \"kubernetes.io/projected/3a7e1e76-75e3-4193-b755-1b044debf71f-kube-api-access-znt95\") pod \"nova-cell0-2c59-account-create-update-ndrfd\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.156225 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.171248 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.172092 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bed7ff8-1ca4-47d8-bb12-a13840612182-operator-scripts\") pod \"nova-cell1-9ea2-account-create-update-7nt6g\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.172175 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flt94\" (UniqueName: \"kubernetes.io/projected/5bed7ff8-1ca4-47d8-bb12-a13840612182-kube-api-access-flt94\") pod \"nova-cell1-9ea2-account-create-update-7nt6g\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.173039 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bed7ff8-1ca4-47d8-bb12-a13840612182-operator-scripts\") pod \"nova-cell1-9ea2-account-create-update-7nt6g\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.184619 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.216564 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flt94\" (UniqueName: \"kubernetes.io/projected/5bed7ff8-1ca4-47d8-bb12-a13840612182-kube-api-access-flt94\") pod \"nova-cell1-9ea2-account-create-update-7nt6g\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.310331 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.318936 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d788ea3-1370-4a64-aff1-d8e2af7c6f94","Type":"ContainerStarted","Data":"57ed499bc0fb2dfb98afc917dda7a834fca6686e4e8094b75c1712bbd2e703a4"} Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.347732 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c284840e-6355-4145-9853-723a3d280963","Type":"ContainerStarted","Data":"2f592bb64ea287894f6c9763e80f2e80dd31e256300ad330972ad782ac01475f"} Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.360518 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerStarted","Data":"17e32a51aaa9c4b1fdbb91a8061d4a9bdd32912ea282a567089812a6881cb08b"} Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.395506 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ch6mc"] Jan 30 21:35:42 crc kubenswrapper[4914]: W0130 21:35:42.396065 4914 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5edf659_0359_4a9c_aca0_d95f3ac8c57e.slice/crio-4b03843798fdd7ed931ea1b203cf928f31f34f40389456a3b27d1d274e4e32fe WatchSource:0}: Error finding container 4b03843798fdd7ed931ea1b203cf928f31f34f40389456a3b27d1d274e4e32fe: Status 404 returned error can't find the container with id 4b03843798fdd7ed931ea1b203cf928f31f34f40389456a3b27d1d274e4e32fe Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.418126 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f69w2"] Jan 30 21:35:42 crc kubenswrapper[4914]: W0130 21:35:42.900301 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a7e1e76_75e3_4193_b755_1b044debf71f.slice/crio-59c88b13a1c73dfdbdabae693e73b92d41b46a727cd65520b2cc50855bd0e79d WatchSource:0}: Error finding container 59c88b13a1c73dfdbdabae693e73b92d41b46a727cd65520b2cc50855bd0e79d: Status 404 returned error can't find the container with id 59c88b13a1c73dfdbdabae693e73b92d41b46a727cd65520b2cc50855bd0e79d Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.901960 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2c59-account-create-update-ndrfd"] Jan 30 21:35:42 crc kubenswrapper[4914]: I0130 21:35:42.950569 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dpz67"] Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.057069 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9ea2-account-create-update-7nt6g"] Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.070693 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a41a-account-create-update-55jbl"] Jan 30 21:35:43 crc kubenswrapper[4914]: W0130 21:35:43.100978 4914 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2dbf656_0687_4199_8101_c25fb82801e8.slice/crio-31603286ab6e401e5179a4557405c8ebc16952ace57baca0ccff4e2f8ac98ddb WatchSource:0}: Error finding container 31603286ab6e401e5179a4557405c8ebc16952ace57baca0ccff4e2f8ac98ddb: Status 404 returned error can't find the container with id 31603286ab6e401e5179a4557405c8ebc16952ace57baca0ccff4e2f8ac98ddb Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.381677 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c284840e-6355-4145-9853-723a3d280963","Type":"ContainerStarted","Data":"f7ed1a40483f073a89414f23a677d85a817c8b14420a8a2b2fe320f5a98fa9e8"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.383337 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" event={"ID":"5bed7ff8-1ca4-47d8-bb12-a13840612182","Type":"ContainerStarted","Data":"dd6b9c37176152cf91804cecaf355f7f1166cf9bf26ce350da4f59eff01e3db5"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.385938 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ch6mc" event={"ID":"40e7c831-3c33-429a-ac8a-e7768226c344","Type":"ContainerStarted","Data":"4181bf5d67c4491e0b98ab6194e9b73e98dbf6f8d7cb4d2fb6b29c00602ebee0"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.385982 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ch6mc" event={"ID":"40e7c831-3c33-429a-ac8a-e7768226c344","Type":"ContainerStarted","Data":"58e77663c6ac28870b134e5e33cb564d9d4389e6fb889ee8099d17eec3d7fe63"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.394200 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d788ea3-1370-4a64-aff1-d8e2af7c6f94","Type":"ContainerStarted","Data":"e06c5a35f3086a4819e6cc2dda0177d4326400b3e65bcc7c8e9dd1dc9fd755e1"} Jan 30 21:35:43 crc 
kubenswrapper[4914]: I0130 21:35:43.394366 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.404077 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" event={"ID":"3a7e1e76-75e3-4193-b755-1b044debf71f","Type":"ContainerStarted","Data":"59c88b13a1c73dfdbdabae693e73b92d41b46a727cd65520b2cc50855bd0e79d"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.404850 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-ch6mc" podStartSLOduration=2.4048356 podStartE2EDuration="2.4048356s" podCreationTimestamp="2026-01-30 21:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:43.402084944 +0000 UTC m=+1276.840721705" watchObservedRunningTime="2026-01-30 21:35:43.4048356 +0000 UTC m=+1276.843472361" Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.414674 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a41a-account-create-update-55jbl" event={"ID":"d2dbf656-0687-4199-8101-c25fb82801e8","Type":"ContainerStarted","Data":"31603286ab6e401e5179a4557405c8ebc16952ace57baca0ccff4e2f8ac98ddb"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.422351 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f69w2" event={"ID":"e5edf659-0359-4a9c-aca0-d95f3ac8c57e","Type":"ContainerStarted","Data":"c89079d181f700e285a12c4da14f91a2f92d5183f8b8a9118aa4c3e9477fc6b4"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.422396 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f69w2" event={"ID":"e5edf659-0359-4a9c-aca0-d95f3ac8c57e","Type":"ContainerStarted","Data":"4b03843798fdd7ed931ea1b203cf928f31f34f40389456a3b27d1d274e4e32fe"} Jan 30 
21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.435820 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dpz67" event={"ID":"db5c9d5a-21ac-4c8c-a108-c6752014ec58","Type":"ContainerStarted","Data":"847a41e77b96dbc05c021809356b8fcb83c5927cf6afe6543b051807450bb93e"} Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.441060 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.151944567 podStartE2EDuration="3.441042828s" podCreationTimestamp="2026-01-30 21:35:40 +0000 UTC" firstStartedPulling="2026-01-30 21:35:41.437433505 +0000 UTC m=+1274.876070266" lastFinishedPulling="2026-01-30 21:35:42.726531766 +0000 UTC m=+1276.165168527" observedRunningTime="2026-01-30 21:35:43.432329389 +0000 UTC m=+1276.870966150" watchObservedRunningTime="2026-01-30 21:35:43.441042828 +0000 UTC m=+1276.879679579" Jan 30 21:35:43 crc kubenswrapper[4914]: I0130 21:35:43.471955 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-f69w2" podStartSLOduration=2.471934638 podStartE2EDuration="2.471934638s" podCreationTimestamp="2026-01-30 21:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:43.460111495 +0000 UTC m=+1276.898748256" watchObservedRunningTime="2026-01-30 21:35:43.471934638 +0000 UTC m=+1276.910571399" Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.304920 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1ec90be3-5dfd-48aa-934c-70ef856a51c5" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.183:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.448117 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerStarted","Data":"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.450718 4914 generic.go:334] "Generic (PLEG): container finished" podID="40e7c831-3c33-429a-ac8a-e7768226c344" containerID="4181bf5d67c4491e0b98ab6194e9b73e98dbf6f8d7cb4d2fb6b29c00602ebee0" exitCode=0 Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.450783 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ch6mc" event={"ID":"40e7c831-3c33-429a-ac8a-e7768226c344","Type":"ContainerDied","Data":"4181bf5d67c4491e0b98ab6194e9b73e98dbf6f8d7cb4d2fb6b29c00602ebee0"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.452887 4914 generic.go:334] "Generic (PLEG): container finished" podID="3a7e1e76-75e3-4193-b755-1b044debf71f" containerID="8ce7ef54868ef9255cfb283abe8c8c4e449f44ab869cda4d58d7586213af5475" exitCode=0 Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.452938 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" event={"ID":"3a7e1e76-75e3-4193-b755-1b044debf71f","Type":"ContainerDied","Data":"8ce7ef54868ef9255cfb283abe8c8c4e449f44ab869cda4d58d7586213af5475"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.454559 4914 generic.go:334] "Generic (PLEG): container finished" podID="d2dbf656-0687-4199-8101-c25fb82801e8" containerID="f92dcde47eb4f286d8ab920088365901dd6531f26bf0caa94de77959d2a73815" exitCode=0 Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.454621 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a41a-account-create-update-55jbl" event={"ID":"d2dbf656-0687-4199-8101-c25fb82801e8","Type":"ContainerDied","Data":"f92dcde47eb4f286d8ab920088365901dd6531f26bf0caa94de77959d2a73815"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.455991 4914 generic.go:334] 
"Generic (PLEG): container finished" podID="e5edf659-0359-4a9c-aca0-d95f3ac8c57e" containerID="c89079d181f700e285a12c4da14f91a2f92d5183f8b8a9118aa4c3e9477fc6b4" exitCode=0 Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.456026 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f69w2" event={"ID":"e5edf659-0359-4a9c-aca0-d95f3ac8c57e","Type":"ContainerDied","Data":"c89079d181f700e285a12c4da14f91a2f92d5183f8b8a9118aa4c3e9477fc6b4"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.459078 4914 generic.go:334] "Generic (PLEG): container finished" podID="db5c9d5a-21ac-4c8c-a108-c6752014ec58" containerID="776fac43153567a3aa1030f424c12e091b83ad9827ad7d122a14748bcce990d5" exitCode=0 Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.459135 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dpz67" event={"ID":"db5c9d5a-21ac-4c8c-a108-c6752014ec58","Type":"ContainerDied","Data":"776fac43153567a3aa1030f424c12e091b83ad9827ad7d122a14748bcce990d5"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.460903 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c284840e-6355-4145-9853-723a3d280963","Type":"ContainerStarted","Data":"cd6fb35189215e86f0db564f077c5e41b00600cfc9735ec1609e9089e713e495"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.461755 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.465927 4914 generic.go:334] "Generic (PLEG): container finished" podID="5bed7ff8-1ca4-47d8-bb12-a13840612182" containerID="54f09901f3d592d392b9f60149550d4d8d9b2c954734d003718dedf616083feb" exitCode=0 Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.465993 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" 
event={"ID":"5bed7ff8-1ca4-47d8-bb12-a13840612182","Type":"ContainerDied","Data":"54f09901f3d592d392b9f60149550d4d8d9b2c954734d003718dedf616083feb"} Jan 30 21:35:44 crc kubenswrapper[4914]: I0130 21:35:44.562770 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.562755728 podStartE2EDuration="4.562755728s" podCreationTimestamp="2026-01-30 21:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:44.561028177 +0000 UTC m=+1277.999664938" watchObservedRunningTime="2026-01-30 21:35:44.562755728 +0000 UTC m=+1278.001392489" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.071190 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.224512 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8l5g\" (UniqueName: \"kubernetes.io/projected/db5c9d5a-21ac-4c8c-a108-c6752014ec58-kube-api-access-w8l5g\") pod \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.224797 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db5c9d5a-21ac-4c8c-a108-c6752014ec58-operator-scripts\") pod \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\" (UID: \"db5c9d5a-21ac-4c8c-a108-c6752014ec58\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.225823 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db5c9d5a-21ac-4c8c-a108-c6752014ec58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db5c9d5a-21ac-4c8c-a108-c6752014ec58" (UID: "db5c9d5a-21ac-4c8c-a108-c6752014ec58"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.239039 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5c9d5a-21ac-4c8c-a108-c6752014ec58-kube-api-access-w8l5g" (OuterVolumeSpecName: "kube-api-access-w8l5g") pod "db5c9d5a-21ac-4c8c-a108-c6752014ec58" (UID: "db5c9d5a-21ac-4c8c-a108-c6752014ec58"). InnerVolumeSpecName "kube-api-access-w8l5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.283470 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.311139 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.314576 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.326859 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db5c9d5a-21ac-4c8c-a108-c6752014ec58-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.326892 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8l5g\" (UniqueName: \"kubernetes.io/projected/db5c9d5a-21ac-4c8c-a108-c6752014ec58-kube-api-access-w8l5g\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.364671 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.414626 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.429062 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcfcm\" (UniqueName: \"kubernetes.io/projected/40e7c831-3c33-429a-ac8a-e7768226c344-kube-api-access-wcfcm\") pod \"40e7c831-3c33-429a-ac8a-e7768226c344\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.429201 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40e7c831-3c33-429a-ac8a-e7768226c344-operator-scripts\") pod \"40e7c831-3c33-429a-ac8a-e7768226c344\" (UID: \"40e7c831-3c33-429a-ac8a-e7768226c344\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.429419 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a7e1e76-75e3-4193-b755-1b044debf71f-operator-scripts\") pod \"3a7e1e76-75e3-4193-b755-1b044debf71f\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.429454 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2dbf656-0687-4199-8101-c25fb82801e8-operator-scripts\") pod \"d2dbf656-0687-4199-8101-c25fb82801e8\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.429551 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znt95\" (UniqueName: \"kubernetes.io/projected/3a7e1e76-75e3-4193-b755-1b044debf71f-kube-api-access-znt95\") pod 
\"3a7e1e76-75e3-4193-b755-1b044debf71f\" (UID: \"3a7e1e76-75e3-4193-b755-1b044debf71f\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.429656 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9tb9\" (UniqueName: \"kubernetes.io/projected/d2dbf656-0687-4199-8101-c25fb82801e8-kube-api-access-m9tb9\") pod \"d2dbf656-0687-4199-8101-c25fb82801e8\" (UID: \"d2dbf656-0687-4199-8101-c25fb82801e8\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.430283 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2dbf656-0687-4199-8101-c25fb82801e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2dbf656-0687-4199-8101-c25fb82801e8" (UID: "d2dbf656-0687-4199-8101-c25fb82801e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.431613 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e7c831-3c33-429a-ac8a-e7768226c344-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40e7c831-3c33-429a-ac8a-e7768226c344" (UID: "40e7c831-3c33-429a-ac8a-e7768226c344"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.432859 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a7e1e76-75e3-4193-b755-1b044debf71f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a7e1e76-75e3-4193-b755-1b044debf71f" (UID: "3a7e1e76-75e3-4193-b755-1b044debf71f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.434916 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a7e1e76-75e3-4193-b755-1b044debf71f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.434942 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2dbf656-0687-4199-8101-c25fb82801e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.434954 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40e7c831-3c33-429a-ac8a-e7768226c344-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.441576 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e7c831-3c33-429a-ac8a-e7768226c344-kube-api-access-wcfcm" (OuterVolumeSpecName: "kube-api-access-wcfcm") pod "40e7c831-3c33-429a-ac8a-e7768226c344" (UID: "40e7c831-3c33-429a-ac8a-e7768226c344"). InnerVolumeSpecName "kube-api-access-wcfcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.441638 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dbf656-0687-4199-8101-c25fb82801e8-kube-api-access-m9tb9" (OuterVolumeSpecName: "kube-api-access-m9tb9") pod "d2dbf656-0687-4199-8101-c25fb82801e8" (UID: "d2dbf656-0687-4199-8101-c25fb82801e8"). InnerVolumeSpecName "kube-api-access-m9tb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.441668 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7e1e76-75e3-4193-b755-1b044debf71f-kube-api-access-znt95" (OuterVolumeSpecName: "kube-api-access-znt95") pod "3a7e1e76-75e3-4193-b755-1b044debf71f" (UID: "3a7e1e76-75e3-4193-b755-1b044debf71f"). InnerVolumeSpecName "kube-api-access-znt95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.516691 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f69w2" event={"ID":"e5edf659-0359-4a9c-aca0-d95f3ac8c57e","Type":"ContainerDied","Data":"4b03843798fdd7ed931ea1b203cf928f31f34f40389456a3b27d1d274e4e32fe"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.516742 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b03843798fdd7ed931ea1b203cf928f31f34f40389456a3b27d1d274e4e32fe" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.516794 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f69w2" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.526951 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dpz67" event={"ID":"db5c9d5a-21ac-4c8c-a108-c6752014ec58","Type":"ContainerDied","Data":"847a41e77b96dbc05c021809356b8fcb83c5927cf6afe6543b051807450bb93e"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.526990 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="847a41e77b96dbc05c021809356b8fcb83c5927cf6afe6543b051807450bb93e" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.527052 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dpz67" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536476 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-operator-scripts\") pod \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536580 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" event={"ID":"5bed7ff8-1ca4-47d8-bb12-a13840612182","Type":"ContainerDied","Data":"dd6b9c37176152cf91804cecaf355f7f1166cf9bf26ce350da4f59eff01e3db5"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536605 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd6b9c37176152cf91804cecaf355f7f1166cf9bf26ce350da4f59eff01e3db5" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536621 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flt94\" (UniqueName: \"kubernetes.io/projected/5bed7ff8-1ca4-47d8-bb12-a13840612182-kube-api-access-flt94\") pod \"5bed7ff8-1ca4-47d8-bb12-a13840612182\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536659 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls425\" (UniqueName: \"kubernetes.io/projected/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-kube-api-access-ls425\") pod \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\" (UID: \"e5edf659-0359-4a9c-aca0-d95f3ac8c57e\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536666 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9ea2-account-create-update-7nt6g" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.536692 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bed7ff8-1ca4-47d8-bb12-a13840612182-operator-scripts\") pod \"5bed7ff8-1ca4-47d8-bb12-a13840612182\" (UID: \"5bed7ff8-1ca4-47d8-bb12-a13840612182\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.537273 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcfcm\" (UniqueName: \"kubernetes.io/projected/40e7c831-3c33-429a-ac8a-e7768226c344-kube-api-access-wcfcm\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.537292 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znt95\" (UniqueName: \"kubernetes.io/projected/3a7e1e76-75e3-4193-b755-1b044debf71f-kube-api-access-znt95\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.537302 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9tb9\" (UniqueName: \"kubernetes.io/projected/d2dbf656-0687-4199-8101-c25fb82801e8-kube-api-access-m9tb9\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.538542 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5edf659-0359-4a9c-aca0-d95f3ac8c57e" (UID: "e5edf659-0359-4a9c-aca0-d95f3ac8c57e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.539282 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed7ff8-1ca4-47d8-bb12-a13840612182-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bed7ff8-1ca4-47d8-bb12-a13840612182" (UID: "5bed7ff8-1ca4-47d8-bb12-a13840612182"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.541872 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bed7ff8-1ca4-47d8-bb12-a13840612182-kube-api-access-flt94" (OuterVolumeSpecName: "kube-api-access-flt94") pod "5bed7ff8-1ca4-47d8-bb12-a13840612182" (UID: "5bed7ff8-1ca4-47d8-bb12-a13840612182"). InnerVolumeSpecName "kube-api-access-flt94". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.542770 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-kube-api-access-ls425" (OuterVolumeSpecName: "kube-api-access-ls425") pod "e5edf659-0359-4a9c-aca0-d95f3ac8c57e" (UID: "e5edf659-0359-4a9c-aca0-d95f3ac8c57e"). InnerVolumeSpecName "kube-api-access-ls425". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.548191 4914 generic.go:334] "Generic (PLEG): container finished" podID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerID="0195835ed917d263a825d5070d67bbb21d31e2b7db9444936980af6a4d7a30e6" exitCode=0 Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.548271 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5789d46bdd-5kscc" event={"ID":"af8dbc06-6b83-49c0-9413-56a90165fb97","Type":"ContainerDied","Data":"0195835ed917d263a825d5070d67bbb21d31e2b7db9444936980af6a4d7a30e6"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.555431 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerStarted","Data":"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.557176 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ch6mc" event={"ID":"40e7c831-3c33-429a-ac8a-e7768226c344","Type":"ContainerDied","Data":"58e77663c6ac28870b134e5e33cb564d9d4389e6fb889ee8099d17eec3d7fe63"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.557273 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e77663c6ac28870b134e5e33cb564d9d4389e6fb889ee8099d17eec3d7fe63" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.557429 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ch6mc" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.570956 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" event={"ID":"3a7e1e76-75e3-4193-b755-1b044debf71f","Type":"ContainerDied","Data":"59c88b13a1c73dfdbdabae693e73b92d41b46a727cd65520b2cc50855bd0e79d"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.570990 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c88b13a1c73dfdbdabae693e73b92d41b46a727cd65520b2cc50855bd0e79d" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.571103 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2c59-account-create-update-ndrfd" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.575052 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.576904 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a41a-account-create-update-55jbl" event={"ID":"d2dbf656-0687-4199-8101-c25fb82801e8","Type":"ContainerDied","Data":"31603286ab6e401e5179a4557405c8ebc16952ace57baca0ccff4e2f8ac98ddb"} Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.576941 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31603286ab6e401e5179a4557405c8ebc16952ace57baca0ccff4e2f8ac98ddb" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.577018 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-a41a-account-create-update-55jbl" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.638872 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.638904 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flt94\" (UniqueName: \"kubernetes.io/projected/5bed7ff8-1ca4-47d8-bb12-a13840612182-kube-api-access-flt94\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.638916 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls425\" (UniqueName: \"kubernetes.io/projected/e5edf659-0359-4a9c-aca0-d95f3ac8c57e-kube-api-access-ls425\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.638925 4914 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bed7ff8-1ca4-47d8-bb12-a13840612182-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.740886 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-config\") pod \"af8dbc06-6b83-49c0-9413-56a90165fb97\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.740946 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-httpd-config\") pod \"af8dbc06-6b83-49c0-9413-56a90165fb97\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.741125 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-82fj2\" (UniqueName: \"kubernetes.io/projected/af8dbc06-6b83-49c0-9413-56a90165fb97-kube-api-access-82fj2\") pod \"af8dbc06-6b83-49c0-9413-56a90165fb97\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.741142 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-combined-ca-bundle\") pod \"af8dbc06-6b83-49c0-9413-56a90165fb97\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.741163 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-ovndb-tls-certs\") pod \"af8dbc06-6b83-49c0-9413-56a90165fb97\" (UID: \"af8dbc06-6b83-49c0-9413-56a90165fb97\") " Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.750393 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8dbc06-6b83-49c0-9413-56a90165fb97-kube-api-access-82fj2" (OuterVolumeSpecName: "kube-api-access-82fj2") pod "af8dbc06-6b83-49c0-9413-56a90165fb97" (UID: "af8dbc06-6b83-49c0-9413-56a90165fb97"). InnerVolumeSpecName "kube-api-access-82fj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.752991 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "af8dbc06-6b83-49c0-9413-56a90165fb97" (UID: "af8dbc06-6b83-49c0-9413-56a90165fb97"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.782592 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.783268 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-log" containerID="cri-o://a7cc55c554c8dffb541010b33783266b1c4829c58b3912a008414f2dd3b2b5f7" gracePeriod=30 Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.783487 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-httpd" containerID="cri-o://71ed2d7a3fc976f936122a7214c258388dbe4184523484ced9e276ec97b83734" gracePeriod=30 Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.833315 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af8dbc06-6b83-49c0-9413-56a90165fb97" (UID: "af8dbc06-6b83-49c0-9413-56a90165fb97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.846106 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82fj2\" (UniqueName: \"kubernetes.io/projected/af8dbc06-6b83-49c0-9413-56a90165fb97-kube-api-access-82fj2\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.846446 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.846507 4914 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.846846 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "af8dbc06-6b83-49c0-9413-56a90165fb97" (UID: "af8dbc06-6b83-49c0-9413-56a90165fb97"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.869166 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-config" (OuterVolumeSpecName: "config") pod "af8dbc06-6b83-49c0-9413-56a90165fb97" (UID: "af8dbc06-6b83-49c0-9413-56a90165fb97"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.948912 4914 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:46 crc kubenswrapper[4914]: I0130 21:35:46.948945 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/af8dbc06-6b83-49c0-9413-56a90165fb97-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.588680 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerStarted","Data":"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce"} Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.592053 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5789d46bdd-5kscc" event={"ID":"af8dbc06-6b83-49c0-9413-56a90165fb97","Type":"ContainerDied","Data":"72c8415d9ec3f5921dc4284d76a85723e413ce9cadcc3a2715ef55d4b919fff2"} Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.592098 4914 scope.go:117] "RemoveContainer" containerID="cd9fa6fd0ba56738d050e4726d6578839bcda9c1b3fee5ae13e7dd973ed8100a" Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.592151 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5789d46bdd-5kscc" Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.594328 4914 generic.go:334] "Generic (PLEG): container finished" podID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerID="a7cc55c554c8dffb541010b33783266b1c4829c58b3912a008414f2dd3b2b5f7" exitCode=143 Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.594376 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7939da09-12d4-4b76-9664-cd12cfd93f72","Type":"ContainerDied","Data":"a7cc55c554c8dffb541010b33783266b1c4829c58b3912a008414f2dd3b2b5f7"} Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.619972 4914 scope.go:117] "RemoveContainer" containerID="0195835ed917d263a825d5070d67bbb21d31e2b7db9444936980af6a4d7a30e6" Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.693998 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5789d46bdd-5kscc"] Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.705978 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5789d46bdd-5kscc"] Jan 30 21:35:47 crc kubenswrapper[4914]: I0130 21:35:47.830768 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" path="/var/lib/kubelet/pods/af8dbc06-6b83-49c0-9413-56a90165fb97/volumes" Jan 30 21:35:48 crc kubenswrapper[4914]: I0130 21:35:48.459935 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:35:48 crc kubenswrapper[4914]: I0130 21:35:48.460347 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-httpd" containerID="cri-o://f1458a6042e2d7a23a8761ef656701b576048b24cfb4b46c29578f9bddc48ad2" gracePeriod=30 Jan 30 21:35:48 crc kubenswrapper[4914]: I0130 21:35:48.460671 4914 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-log" containerID="cri-o://be8cb6bb42dc103d7c49626741fc2083405b05cb9dd20814432ff3d5970afe4b" gracePeriod=30 Jan 30 21:35:48 crc kubenswrapper[4914]: I0130 21:35:48.589234 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:48 crc kubenswrapper[4914]: I0130 21:35:48.611491 4914 generic.go:334] "Generic (PLEG): container finished" podID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerID="be8cb6bb42dc103d7c49626741fc2083405b05cb9dd20814432ff3d5970afe4b" exitCode=143 Jan 30 21:35:48 crc kubenswrapper[4914]: I0130 21:35:48.611572 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cce3e04-8dbe-4df9-aed0-45303d35e7c4","Type":"ContainerDied","Data":"be8cb6bb42dc103d7c49626741fc2083405b05cb9dd20814432ff3d5970afe4b"} Jan 30 21:35:51 crc kubenswrapper[4914]: I0130 21:35:51.108567 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 21:35:51 crc kubenswrapper[4914]: I0130 21:35:51.648273 4914 generic.go:334] "Generic (PLEG): container finished" podID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerID="71ed2d7a3fc976f936122a7214c258388dbe4184523484ced9e276ec97b83734" exitCode=0 Jan 30 21:35:51 crc kubenswrapper[4914]: I0130 21:35:51.648342 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7939da09-12d4-4b76-9664-cd12cfd93f72","Type":"ContainerDied","Data":"71ed2d7a3fc976f936122a7214c258388dbe4184523484ced9e276ec97b83734"} Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.418395 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8vzt"] Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.418962 4914 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5edf659-0359-4a9c-aca0-d95f3ac8c57e" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.418974 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5edf659-0359-4a9c-aca0-d95f3ac8c57e" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.418984 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-httpd" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.418990 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-httpd" Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.418996 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7e1e76-75e3-4193-b755-1b044debf71f" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419002 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7e1e76-75e3-4193-b755-1b044debf71f" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.419011 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbf656-0687-4199-8101-c25fb82801e8" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419018 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbf656-0687-4199-8101-c25fb82801e8" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.419031 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-api" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419037 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-api" Jan 30 21:35:52 crc 
kubenswrapper[4914]: E0130 21:35:52.419047 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5c9d5a-21ac-4c8c-a108-c6752014ec58" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419052 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5c9d5a-21ac-4c8c-a108-c6752014ec58" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.419069 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bed7ff8-1ca4-47d8-bb12-a13840612182" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419074 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bed7ff8-1ca4-47d8-bb12-a13840612182" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: E0130 21:35:52.419081 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e7c831-3c33-429a-ac8a-e7768226c344" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419086 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e7c831-3c33-429a-ac8a-e7768226c344" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419258 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5c9d5a-21ac-4c8c-a108-c6752014ec58" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419267 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bed7ff8-1ca4-47d8-bb12-a13840612182" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419277 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a7e1e76-75e3-4193-b755-1b044debf71f" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419286 4914 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="40e7c831-3c33-429a-ac8a-e7768226c344" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419298 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dbf656-0687-4199-8101-c25fb82801e8" containerName="mariadb-account-create-update" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419310 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5edf659-0359-4a9c-aca0-d95f3ac8c57e" containerName="mariadb-database-create" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419320 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-api" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419326 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8dbc06-6b83-49c0-9413-56a90165fb97" containerName="neutron-httpd" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.419948 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.423808 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kmd89" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.424839 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.425189 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.426960 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8vzt"] Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.485437 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-scripts\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.485816 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgsnx\" (UniqueName: \"kubernetes.io/projected/39e5479a-5cbf-479b-b3b9-3af2a3424492-kube-api-access-sgsnx\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.485896 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " 
pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.485954 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-config-data\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.566648 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.589021 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-scripts\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.589131 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgsnx\" (UniqueName: \"kubernetes.io/projected/39e5479a-5cbf-479b-b3b9-3af2a3424492-kube-api-access-sgsnx\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.589200 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.589239 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-config-data\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.612986 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.615130 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-scripts\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.671775 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-config-data\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.671830 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgsnx\" (UniqueName: \"kubernetes.io/projected/39e5479a-5cbf-479b-b3b9-3af2a3424492-kube-api-access-sgsnx\") pod \"nova-cell0-conductor-db-sync-h8vzt\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.702944 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-public-tls-certs\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.702982 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-config-data\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.703022 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-httpd-run\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.703038 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-logs\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.703198 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.703268 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-combined-ca-bundle\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.703294 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwq27\" (UniqueName: \"kubernetes.io/projected/7939da09-12d4-4b76-9664-cd12cfd93f72-kube-api-access-qwq27\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.703321 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-scripts\") pod \"7939da09-12d4-4b76-9664-cd12cfd93f72\" (UID: \"7939da09-12d4-4b76-9664-cd12cfd93f72\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.706979 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-scripts" (OuterVolumeSpecName: "scripts") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.708204 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-logs" (OuterVolumeSpecName: "logs") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.711189 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.721681 4914 generic.go:334] "Generic (PLEG): container finished" podID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerID="f1458a6042e2d7a23a8761ef656701b576048b24cfb4b46c29578f9bddc48ad2" exitCode=0 Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.721763 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cce3e04-8dbe-4df9-aed0-45303d35e7c4","Type":"ContainerDied","Data":"f1458a6042e2d7a23a8761ef656701b576048b24cfb4b46c29578f9bddc48ad2"} Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.729997 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7939da09-12d4-4b76-9664-cd12cfd93f72-kube-api-access-qwq27" (OuterVolumeSpecName: "kube-api-access-qwq27") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "kube-api-access-qwq27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.734946 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerStarted","Data":"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca"} Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.735098 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-central-agent" containerID="cri-o://4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" gracePeriod=30 Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.735169 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.735488 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="proxy-httpd" containerID="cri-o://c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" gracePeriod=30 Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.735532 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="sg-core" containerID="cri-o://9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" gracePeriod=30 Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.735564 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-notification-agent" containerID="cri-o://e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" gracePeriod=30 Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.746570 4914 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.756869 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7939da09-12d4-4b76-9664-cd12cfd93f72","Type":"ContainerDied","Data":"744d0fe8ea320619e1016487d5236e70a95b7b1d950244af7eed866659e0ad61"} Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.756913 4914 scope.go:117] "RemoveContainer" containerID="71ed2d7a3fc976f936122a7214c258388dbe4184523484ced9e276ec97b83734" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.757058 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.770917 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.159106742 podStartE2EDuration="12.770897132s" podCreationTimestamp="2026-01-30 21:35:40 +0000 UTC" firstStartedPulling="2026-01-30 21:35:41.699833833 +0000 UTC m=+1275.138470594" lastFinishedPulling="2026-01-30 21:35:51.311624203 +0000 UTC m=+1284.750260984" observedRunningTime="2026-01-30 21:35:52.758102095 +0000 UTC m=+1286.196738856" watchObservedRunningTime="2026-01-30 21:35:52.770897132 +0000 UTC m=+1286.209533893" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.771920 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.805572 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.806250 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwq27\" (UniqueName: \"kubernetes.io/projected/7939da09-12d4-4b76-9664-cd12cfd93f72-kube-api-access-qwq27\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.806262 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.806270 4914 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.806278 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7939da09-12d4-4b76-9664-cd12cfd93f72-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.813382 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.820350 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23" (OuterVolumeSpecName: "glance") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "pvc-581e81f1-e0f1-495b-b12a-80feb1423c23". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.840454 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.853094 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-config-data" (OuterVolumeSpecName: "config-data") pod "7939da09-12d4-4b76-9664-cd12cfd93f72" (UID: "7939da09-12d4-4b76-9664-cd12cfd93f72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.893861 4914 scope.go:117] "RemoveContainer" containerID="a7cc55c554c8dffb541010b33783266b1c4829c58b3912a008414f2dd3b2b5f7" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.908060 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-combined-ca-bundle\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.908414 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.908676 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-httpd-run\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.908920 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfwdc\" (UniqueName: \"kubernetes.io/projected/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-kube-api-access-jfwdc\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.908995 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-config-data\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.909094 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-logs\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.909230 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-internal-tls-certs\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.909321 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-scripts\") pod \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\" (UID: \"9cce3e04-8dbe-4df9-aed0-45303d35e7c4\") " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.909835 4914 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.910036 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7939da09-12d4-4b76-9664-cd12cfd93f72-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.910110 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") on node \"crc\" " Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.909337 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.909570 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-logs" (OuterVolumeSpecName: "logs") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.927827 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-scripts" (OuterVolumeSpecName: "scripts") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.949020 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-kube-api-access-jfwdc" (OuterVolumeSpecName: "kube-api-access-jfwdc") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "kube-api-access-jfwdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.956019 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1" (OuterVolumeSpecName: "glance") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.957647 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.957845 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-581e81f1-e0f1-495b-b12a-80feb1423c23" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23") on node "crc" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.989848 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:52 crc kubenswrapper[4914]: I0130 21:35:52.997807 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013908 4914 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013935 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfwdc\" (UniqueName: \"kubernetes.io/projected/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-kube-api-access-jfwdc\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013946 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013955 4914 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013964 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013973 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.013982 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.014005 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") on node \"crc\" " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.025822 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-config-data" (OuterVolumeSpecName: "config-data") pod "9cce3e04-8dbe-4df9-aed0-45303d35e7c4" (UID: "9cce3e04-8dbe-4df9-aed0-45303d35e7c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.045445 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.045577 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1") on node "crc" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.116933 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.117335 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cce3e04-8dbe-4df9-aed0-45303d35e7c4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.117404 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.129663 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.146554 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.147127 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147147 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.147173 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-log" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 
21:35:53.147182 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-log" Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.147203 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-log" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147214 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-log" Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.147235 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147245 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147504 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-log" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147522 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" containerName="glance-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147537 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-log" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.147556 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" containerName="glance-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.164314 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.164423 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.170135 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.170178 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219039 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219078 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219095 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fns5w\" (UniqueName: \"kubernetes.io/projected/fc35b1ae-deb3-425f-86a2-530461b4a6f1-kube-api-access-fns5w\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219131 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219159 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc35b1ae-deb3-425f-86a2-530461b4a6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219218 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219245 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.219313 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc35b1ae-deb3-425f-86a2-530461b4a6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.258912 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331487 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331545 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331721 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc35b1ae-deb3-425f-86a2-530461b4a6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331790 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331818 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331838 4914 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fns5w\" (UniqueName: \"kubernetes.io/projected/fc35b1ae-deb3-425f-86a2-530461b4a6f1-kube-api-access-fns5w\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331883 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.331941 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc35b1ae-deb3-425f-86a2-530461b4a6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.332337 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc35b1ae-deb3-425f-86a2-530461b4a6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.332764 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc35b1ae-deb3-425f-86a2-530461b4a6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.336779 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.336814 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b72a388fd97a6544e59fcf27848209fccbec8612f776f09e5be6189768241719/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.336864 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.337692 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.339390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.339403 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc35b1ae-deb3-425f-86a2-530461b4a6f1-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.351587 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fns5w\" (UniqueName: \"kubernetes.io/projected/fc35b1ae-deb3-425f-86a2-530461b4a6f1-kube-api-access-fns5w\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.395533 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-581e81f1-e0f1-495b-b12a-80feb1423c23\") pod \"glance-default-external-api-0\" (UID: \"fc35b1ae-deb3-425f-86a2-530461b4a6f1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.526207 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8vzt"] Jan 30 21:35:53 crc kubenswrapper[4914]: W0130 21:35:53.531565 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39e5479a_5cbf_479b_b3b9_3af2a3424492.slice/crio-ed57e5c786900b84549119a3925a4f9c7d6b8c4bf040dcbe0fb5f21012c7e9a2 WatchSource:0}: Error finding container ed57e5c786900b84549119a3925a4f9c7d6b8c4bf040dcbe0fb5f21012c7e9a2: Status 404 returned error can't find the container with id ed57e5c786900b84549119a3925a4f9c7d6b8c4bf040dcbe0fb5f21012c7e9a2 Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.552362 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.711539 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.778661 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" event={"ID":"39e5479a-5cbf-479b-b3b9-3af2a3424492","Type":"ContainerStarted","Data":"ed57e5c786900b84549119a3925a4f9c7d6b8c4bf040dcbe0fb5f21012c7e9a2"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.780354 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cce3e04-8dbe-4df9-aed0-45303d35e7c4","Type":"ContainerDied","Data":"20b47533104844fa5ef3d36cb485fdaf4f40a6e1de9efadd6f67542d5b4e13a8"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.780384 4914 scope.go:117] "RemoveContainer" containerID="f1458a6042e2d7a23a8761ef656701b576048b24cfb4b46c29578f9bddc48ad2" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.780469 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796739 4914 generic.go:334] "Generic (PLEG): container finished" podID="f9407523-2a66-49a6-98d7-a8e53961e788" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" exitCode=0 Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796762 4914 generic.go:334] "Generic (PLEG): container finished" podID="f9407523-2a66-49a6-98d7-a8e53961e788" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" exitCode=2 Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796812 4914 generic.go:334] "Generic (PLEG): container finished" podID="f9407523-2a66-49a6-98d7-a8e53961e788" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" exitCode=0 Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796824 4914 generic.go:334] "Generic (PLEG): container finished" podID="f9407523-2a66-49a6-98d7-a8e53961e788" 
containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" exitCode=0 Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796777 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796792 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerDied","Data":"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796904 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerDied","Data":"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796948 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerDied","Data":"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796960 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerDied","Data":"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.796969 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9407523-2a66-49a6-98d7-a8e53961e788","Type":"ContainerDied","Data":"17e32a51aaa9c4b1fdbb91a8061d4a9bdd32912ea282a567089812a6881cb08b"} Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.837365 4914 scope.go:117] "RemoveContainer" containerID="be8cb6bb42dc103d7c49626741fc2083405b05cb9dd20814432ff3d5970afe4b" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853603 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-log-httpd\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853646 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-config-data\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853685 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-scripts\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853751 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-sg-core-conf-yaml\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853786 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-run-httpd\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853841 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-ceilometer-tls-certs\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: 
\"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853908 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-combined-ca-bundle\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.853944 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z48fw\" (UniqueName: \"kubernetes.io/projected/f9407523-2a66-49a6-98d7-a8e53961e788-kube-api-access-z48fw\") pod \"f9407523-2a66-49a6-98d7-a8e53961e788\" (UID: \"f9407523-2a66-49a6-98d7-a8e53961e788\") " Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.858776 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.860995 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.862771 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7939da09-12d4-4b76-9664-cd12cfd93f72" path="/var/lib/kubelet/pods/7939da09-12d4-4b76-9664-cd12cfd93f72/volumes" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.867883 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9407523-2a66-49a6-98d7-a8e53961e788-kube-api-access-z48fw" (OuterVolumeSpecName: "kube-api-access-z48fw") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "kube-api-access-z48fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.894340 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-scripts" (OuterVolumeSpecName: "scripts") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.953818 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.956222 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.956248 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.956259 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z48fw\" (UniqueName: \"kubernetes.io/projected/f9407523-2a66-49a6-98d7-a8e53961e788-kube-api-access-z48fw\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.956269 4914 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9407523-2a66-49a6-98d7-a8e53961e788-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.956276 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.963791 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.963821 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.963835 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.964173 4914 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="proxy-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964188 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="proxy-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.964199 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-central-agent" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964205 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-central-agent" Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.964221 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="sg-core" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964228 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="sg-core" Jan 30 21:35:53 crc kubenswrapper[4914]: E0130 21:35:53.964236 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-notification-agent" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964241 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-notification-agent" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964432 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-notification-agent" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964446 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="ceilometer-central-agent" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964456 4914 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="proxy-httpd" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.964472 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" containerName="sg-core" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.965489 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.965617 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.978597 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.978734 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.978909 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 21:35:53 crc kubenswrapper[4914]: I0130 21:35:53.996070 4914 scope.go:117] "RemoveContainer" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.036562 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-config-data" (OuterVolumeSpecName: "config-data") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.054138 4914 scope.go:117] "RemoveContainer" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.062245 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.065811 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.070330 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9407523-2a66-49a6-98d7-a8e53961e788" (UID: "f9407523-2a66-49a6-98d7-a8e53961e788"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.095447 4914 scope.go:117] "RemoveContainer" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.132247 4914 scope.go:117] "RemoveContainer" containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.157556 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.170902 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173501 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00a00054-eb1e-492b-854d-4ea3396983ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173601 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173633 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173666 
4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173699 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173775 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173813 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00a00054-eb1e-492b-854d-4ea3396983ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173844 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmjz\" (UniqueName: \"kubernetes.io/projected/00a00054-eb1e-492b-854d-4ea3396983ef-kube-api-access-psmjz\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" 
Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.173915 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407523-2a66-49a6-98d7-a8e53961e788-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.180681 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.183014 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.188466 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.189357 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.190168 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.194870 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.202340 4914 scope.go:117] "RemoveContainer" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" Jan 30 21:35:54 crc kubenswrapper[4914]: E0130 21:35:54.207242 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": container with ID starting with c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca not found: ID does not exist" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.207282 4914 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca"} err="failed to get container status \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": rpc error: code = NotFound desc = could not find container \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": container with ID starting with c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.207309 4914 scope.go:117] "RemoveContainer" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" Jan 30 21:35:54 crc kubenswrapper[4914]: E0130 21:35:54.209025 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": container with ID starting with 9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce not found: ID does not exist" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.209133 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce"} err="failed to get container status \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": rpc error: code = NotFound desc = could not find container \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": container with ID starting with 9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.209177 4914 scope.go:117] "RemoveContainer" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" Jan 30 21:35:54 crc kubenswrapper[4914]: E0130 21:35:54.209932 4914 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": container with ID starting with e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a not found: ID does not exist" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.209958 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a"} err="failed to get container status \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": rpc error: code = NotFound desc = could not find container \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": container with ID starting with e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.210014 4914 scope.go:117] "RemoveContainer" containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" Jan 30 21:35:54 crc kubenswrapper[4914]: E0130 21:35:54.210854 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": container with ID starting with 4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957 not found: ID does not exist" containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.210992 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957"} err="failed to get container status \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": rpc error: code = NotFound desc = could not find container 
\"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": container with ID starting with 4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957 not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.211106 4914 scope.go:117] "RemoveContainer" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.212028 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca"} err="failed to get container status \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": rpc error: code = NotFound desc = could not find container \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": container with ID starting with c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.212063 4914 scope.go:117] "RemoveContainer" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.212547 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce"} err="failed to get container status \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": rpc error: code = NotFound desc = could not find container \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": container with ID starting with 9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.212574 4914 scope.go:117] "RemoveContainer" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.213418 4914 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a"} err="failed to get container status \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": rpc error: code = NotFound desc = could not find container \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": container with ID starting with e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.213444 4914 scope.go:117] "RemoveContainer" containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.213746 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957"} err="failed to get container status \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": rpc error: code = NotFound desc = could not find container \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": container with ID starting with 4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957 not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.213838 4914 scope.go:117] "RemoveContainer" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.214868 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca"} err="failed to get container status \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": rpc error: code = NotFound desc = could not find container \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": container with ID starting with 
c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.214967 4914 scope.go:117] "RemoveContainer" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.215422 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce"} err="failed to get container status \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": rpc error: code = NotFound desc = could not find container \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": container with ID starting with 9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.215476 4914 scope.go:117] "RemoveContainer" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.217963 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a"} err="failed to get container status \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": rpc error: code = NotFound desc = could not find container \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": container with ID starting with e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.217986 4914 scope.go:117] "RemoveContainer" containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.218371 4914 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957"} err="failed to get container status \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": rpc error: code = NotFound desc = could not find container \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": container with ID starting with 4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957 not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.218388 4914 scope.go:117] "RemoveContainer" containerID="c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.218831 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca"} err="failed to get container status \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": rpc error: code = NotFound desc = could not find container \"c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca\": container with ID starting with c54888f048e0c7cce7a53566856e494bd8fb72e79808bc6ff838535628bc63ca not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.218847 4914 scope.go:117] "RemoveContainer" containerID="9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.219533 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce"} err="failed to get container status \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": rpc error: code = NotFound desc = could not find container \"9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce\": container with ID starting with 9971efc64ec9e529fd94923f42b2e92dae8fae4af2af75a4eabfffbd851a3fce not found: ID does not 
exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.219552 4914 scope.go:117] "RemoveContainer" containerID="e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.219762 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a"} err="failed to get container status \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": rpc error: code = NotFound desc = could not find container \"e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a\": container with ID starting with e31d661a6c4d311f14edfef61a44a03ba66c0c18487558ab9aaa83ed05b0683a not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.219777 4914 scope.go:117] "RemoveContainer" containerID="4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.220154 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957"} err="failed to get container status \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": rpc error: code = NotFound desc = could not find container \"4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957\": container with ID starting with 4bb16c1179828ea2bd0bec0d3340e1680b7463f6342625e42e0a0f57a579c957 not found: ID does not exist" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.275818 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psmjz\" (UniqueName: \"kubernetes.io/projected/00a00054-eb1e-492b-854d-4ea3396983ef-kube-api-access-psmjz\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: 
I0130 21:35:54.276001 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00a00054-eb1e-492b-854d-4ea3396983ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.276138 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.276229 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.276314 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.276403 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 
21:35:54.276509 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.276595 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00a00054-eb1e-492b-854d-4ea3396983ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.276774 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00a00054-eb1e-492b-854d-4ea3396983ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.277101 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00a00054-eb1e-492b-854d-4ea3396983ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.282171 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.282623 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66c50567016faa78360e7f45b700987189e7fa2a9601532760fae56b995ba54f/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.283513 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.284163 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.284833 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.288221 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.291797 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a00054-eb1e-492b-854d-4ea3396983ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.293759 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmjz\" (UniqueName: \"kubernetes.io/projected/00a00054-eb1e-492b-854d-4ea3396983ef-kube-api-access-psmjz\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.331338 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-062e92ed-03b3-4a42-91bc-4e66f0b1aaf1\") pod \"glance-default-internal-api-0\" (UID: \"00a00054-eb1e-492b-854d-4ea3396983ef\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.383502 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-run-httpd\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.383638 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-log-httpd\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.383847 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.383931 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-config-data\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.383989 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-scripts\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.384056 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.384741 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5sxr\" (UniqueName: \"kubernetes.io/projected/2cd0ddef-4517-43be-843d-e578319733d0-kube-api-access-z5sxr\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.384794 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.486251 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.486311 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-config-data\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.486358 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-scripts\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.486860 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.486953 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5sxr\" (UniqueName: \"kubernetes.io/projected/2cd0ddef-4517-43be-843d-e578319733d0-kube-api-access-z5sxr\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc 
kubenswrapper[4914]: I0130 21:35:54.487011 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.487071 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-run-httpd\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.487118 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-log-httpd\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.487514 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-log-httpd\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.490193 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-run-httpd\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.491231 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-scripts\") pod \"ceilometer-0\" (UID: 
\"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.495891 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.496402 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.497268 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.498860 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-config-data\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.508271 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5sxr\" (UniqueName: \"kubernetes.io/projected/2cd0ddef-4517-43be-843d-e578319733d0-kube-api-access-z5sxr\") pod \"ceilometer-0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.514492 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.614676 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:35:54 crc kubenswrapper[4914]: I0130 21:35:54.825192 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc35b1ae-deb3-425f-86a2-530461b4a6f1","Type":"ContainerStarted","Data":"394d3b7b2673b190c24238f70b5849433300811ed5459d186691b3b30c8abba8"} Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.057069 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.291543 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:35:55 crc kubenswrapper[4914]: W0130 21:35:55.292800 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00a00054_eb1e_492b_854d_4ea3396983ef.slice/crio-2179af14d450139b4f0145009200a51e2360a329d3e07d29d0d570e330dc0f63 WatchSource:0}: Error finding container 2179af14d450139b4f0145009200a51e2360a329d3e07d29d0d570e330dc0f63: Status 404 returned error can't find the container with id 2179af14d450139b4f0145009200a51e2360a329d3e07d29d0d570e330dc0f63 Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.834928 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cce3e04-8dbe-4df9-aed0-45303d35e7c4" path="/var/lib/kubelet/pods/9cce3e04-8dbe-4df9-aed0-45303d35e7c4/volumes" Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.836120 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9407523-2a66-49a6-98d7-a8e53961e788" path="/var/lib/kubelet/pods/f9407523-2a66-49a6-98d7-a8e53961e788/volumes" Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.839553 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerStarted","Data":"cc3bcf675a0bdb847d55aa30d60da3c432500fbda9920f0ecd9f0ad02d148426"} Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.841652 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00a00054-eb1e-492b-854d-4ea3396983ef","Type":"ContainerStarted","Data":"2179af14d450139b4f0145009200a51e2360a329d3e07d29d0d570e330dc0f63"} Jan 30 21:35:55 crc kubenswrapper[4914]: I0130 21:35:55.844805 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc35b1ae-deb3-425f-86a2-530461b4a6f1","Type":"ContainerStarted","Data":"c3b1360508ebab41a3121050949bd3f490a1e59c782f2f8ee76e93bd3aaaad70"} Jan 30 21:35:56 crc kubenswrapper[4914]: I0130 21:35:56.871888 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00a00054-eb1e-492b-854d-4ea3396983ef","Type":"ContainerStarted","Data":"f14b3dfa7a94c81d8290d0cce7473c280b1d9432a1395d753aedaf65a483a612"} Jan 30 21:35:56 crc kubenswrapper[4914]: I0130 21:35:56.872515 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"00a00054-eb1e-492b-854d-4ea3396983ef","Type":"ContainerStarted","Data":"d8e2a970a21e386c7e5e6ded172adb2800cb5f104b0788670e4069fcd08ebf8b"} Jan 30 21:35:56 crc kubenswrapper[4914]: I0130 21:35:56.878965 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc35b1ae-deb3-425f-86a2-530461b4a6f1","Type":"ContainerStarted","Data":"cd7d2683c7b57715200b81ae216a67cc38ff4f997143817b74793c7d9b41171b"} Jan 30 21:35:56 crc kubenswrapper[4914]: I0130 21:35:56.886368 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerStarted","Data":"71acee96adf3a1a43380a6c77086f652d72a339d33b4748f21a9fcda7b900d3c"} Jan 30 21:35:56 crc kubenswrapper[4914]: I0130 21:35:56.908244 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.9082275859999998 podStartE2EDuration="3.908227586s" podCreationTimestamp="2026-01-30 21:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:56.896968706 +0000 UTC m=+1290.335605467" watchObservedRunningTime="2026-01-30 21:35:56.908227586 +0000 UTC m=+1290.346864347" Jan 30 21:35:57 crc kubenswrapper[4914]: I0130 21:35:57.851809 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.851792116 podStartE2EDuration="4.851792116s" podCreationTimestamp="2026-01-30 21:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:35:56.93092091 +0000 UTC m=+1290.369557671" watchObservedRunningTime="2026-01-30 21:35:57.851792116 +0000 UTC m=+1291.290428877" Jan 30 21:35:57 crc kubenswrapper[4914]: I0130 21:35:57.901354 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerStarted","Data":"0e1059693a39abe2197c88da5e4b7fef209bcd46abe40c5601321ba31280fd84"} Jan 30 21:36:01 crc kubenswrapper[4914]: I0130 21:36:01.465673 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Jan 30 21:36:02 crc kubenswrapper[4914]: I0130 21:36:02.960584 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" 
event={"ID":"39e5479a-5cbf-479b-b3b9-3af2a3424492","Type":"ContainerStarted","Data":"0de72de9e1c09ecfa7bb5ad9c1b252f5a872040b65cefa8c01285030bee47686"} Jan 30 21:36:02 crc kubenswrapper[4914]: I0130 21:36:02.966569 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerStarted","Data":"fc93cb161f2087f47df13e923aa04ed0dd75aa5cf3f9627ed5e60a2218887c43"} Jan 30 21:36:02 crc kubenswrapper[4914]: I0130 21:36:02.980622 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" podStartSLOduration=2.187053067 podStartE2EDuration="10.9806046s" podCreationTimestamp="2026-01-30 21:35:52 +0000 UTC" firstStartedPulling="2026-01-30 21:35:53.536654831 +0000 UTC m=+1286.975291582" lastFinishedPulling="2026-01-30 21:36:02.330206344 +0000 UTC m=+1295.768843115" observedRunningTime="2026-01-30 21:36:02.976989753 +0000 UTC m=+1296.415626544" watchObservedRunningTime="2026-01-30 21:36:02.9806046 +0000 UTC m=+1296.419241361" Jan 30 21:36:03 crc kubenswrapper[4914]: I0130 21:36:03.553401 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 21:36:03 crc kubenswrapper[4914]: I0130 21:36:03.553461 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 21:36:03 crc kubenswrapper[4914]: I0130 21:36:03.588281 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 21:36:03 crc kubenswrapper[4914]: I0130 21:36:03.597914 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 21:36:03 crc kubenswrapper[4914]: I0130 21:36:03.975770 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 
30 21:36:03 crc kubenswrapper[4914]: I0130 21:36:03.975802 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 21:36:04 crc kubenswrapper[4914]: I0130 21:36:04.615971 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:04 crc kubenswrapper[4914]: I0130 21:36:04.616366 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:04 crc kubenswrapper[4914]: I0130 21:36:04.670206 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:04 crc kubenswrapper[4914]: I0130 21:36:04.675047 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:04 crc kubenswrapper[4914]: I0130 21:36:04.982661 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:04 crc kubenswrapper[4914]: I0130 21:36:04.982719 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:05 crc kubenswrapper[4914]: I0130 21:36:05.304133 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:05 crc kubenswrapper[4914]: I0130 21:36:05.937502 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:36:05 crc kubenswrapper[4914]: I0130 21:36:05.943143 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.002297 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2cd0ddef-4517-43be-843d-e578319733d0" 
containerName="ceilometer-central-agent" containerID="cri-o://71acee96adf3a1a43380a6c77086f652d72a339d33b4748f21a9fcda7b900d3c" gracePeriod=30 Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.002724 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerStarted","Data":"551da97a40dc79605d13ed7103fcb8c285ff1bf8f79558fe06812df038ec9f99"} Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.004002 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.003825 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="sg-core" containerID="cri-o://fc93cb161f2087f47df13e923aa04ed0dd75aa5cf3f9627ed5e60a2218887c43" gracePeriod=30 Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.003845 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-notification-agent" containerID="cri-o://0e1059693a39abe2197c88da5e4b7fef209bcd46abe40c5601321ba31280fd84" gracePeriod=30 Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.003815 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="proxy-httpd" containerID="cri-o://551da97a40dc79605d13ed7103fcb8c285ff1bf8f79558fe06812df038ec9f99" gracePeriod=30 Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.050417 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.870724634 podStartE2EDuration="12.050393452s" podCreationTimestamp="2026-01-30 21:35:54 +0000 UTC" firstStartedPulling="2026-01-30 21:35:55.062575778 +0000 UTC 
m=+1288.501212539" lastFinishedPulling="2026-01-30 21:36:05.242244596 +0000 UTC m=+1298.680881357" observedRunningTime="2026-01-30 21:36:06.037942863 +0000 UTC m=+1299.476579644" watchObservedRunningTime="2026-01-30 21:36:06.050393452 +0000 UTC m=+1299.489030213" Jan 30 21:36:06 crc kubenswrapper[4914]: I0130 21:36:06.996918 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.012901 4914 generic.go:334] "Generic (PLEG): container finished" podID="2cd0ddef-4517-43be-843d-e578319733d0" containerID="551da97a40dc79605d13ed7103fcb8c285ff1bf8f79558fe06812df038ec9f99" exitCode=0 Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.012956 4914 generic.go:334] "Generic (PLEG): container finished" podID="2cd0ddef-4517-43be-843d-e578319733d0" containerID="fc93cb161f2087f47df13e923aa04ed0dd75aa5cf3f9627ed5e60a2218887c43" exitCode=2 Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.012968 4914 generic.go:334] "Generic (PLEG): container finished" podID="2cd0ddef-4517-43be-843d-e578319733d0" containerID="71acee96adf3a1a43380a6c77086f652d72a339d33b4748f21a9fcda7b900d3c" exitCode=0 Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.012963 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerDied","Data":"551da97a40dc79605d13ed7103fcb8c285ff1bf8f79558fe06812df038ec9f99"} Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.013005 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerDied","Data":"fc93cb161f2087f47df13e923aa04ed0dd75aa5cf3f9627ed5e60a2218887c43"} Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.013016 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerDied","Data":"71acee96adf3a1a43380a6c77086f652d72a339d33b4748f21a9fcda7b900d3c"} Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.013118 4914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:36:07 crc kubenswrapper[4914]: I0130 21:36:07.058377 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.028291 4914 generic.go:334] "Generic (PLEG): container finished" podID="2cd0ddef-4517-43be-843d-e578319733d0" containerID="0e1059693a39abe2197c88da5e4b7fef209bcd46abe40c5601321ba31280fd84" exitCode=0 Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.029251 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerDied","Data":"0e1059693a39abe2197c88da5e4b7fef209bcd46abe40c5601321ba31280fd84"} Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.202257 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341655 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-log-httpd\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341741 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-sg-core-conf-yaml\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341797 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-combined-ca-bundle\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341828 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5sxr\" (UniqueName: \"kubernetes.io/projected/2cd0ddef-4517-43be-843d-e578319733d0-kube-api-access-z5sxr\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341861 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-run-httpd\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341940 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-scripts\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.341982 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-ceilometer-tls-certs\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.342202 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-config-data\") pod \"2cd0ddef-4517-43be-843d-e578319733d0\" (UID: \"2cd0ddef-4517-43be-843d-e578319733d0\") " Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.342269 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.342460 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.347805 4914 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.347837 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cd0ddef-4517-43be-843d-e578319733d0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.353049 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-scripts" (OuterVolumeSpecName: "scripts") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.353118 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd0ddef-4517-43be-843d-e578319733d0-kube-api-access-z5sxr" (OuterVolumeSpecName: "kube-api-access-z5sxr") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "kube-api-access-z5sxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.390792 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.413088 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.450293 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.450333 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5sxr\" (UniqueName: \"kubernetes.io/projected/2cd0ddef-4517-43be-843d-e578319733d0-kube-api-access-z5sxr\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.450348 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.450360 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.456939 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.472583 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-config-data" (OuterVolumeSpecName: "config-data") pod "2cd0ddef-4517-43be-843d-e578319733d0" (UID: "2cd0ddef-4517-43be-843d-e578319733d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.552239 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:08 crc kubenswrapper[4914]: I0130 21:36:08.552280 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd0ddef-4517-43be-843d-e578319733d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.042090 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2cd0ddef-4517-43be-843d-e578319733d0","Type":"ContainerDied","Data":"cc3bcf675a0bdb847d55aa30d60da3c432500fbda9920f0ecd9f0ad02d148426"} Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.042132 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.042157 4914 scope.go:117] "RemoveContainer" containerID="551da97a40dc79605d13ed7103fcb8c285ff1bf8f79558fe06812df038ec9f99" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.079691 4914 scope.go:117] "RemoveContainer" containerID="fc93cb161f2087f47df13e923aa04ed0dd75aa5cf3f9627ed5e60a2218887c43" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.094448 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.109802 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121175 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:09 crc kubenswrapper[4914]: E0130 21:36:09.121586 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="proxy-httpd" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121604 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="proxy-httpd" Jan 30 21:36:09 crc kubenswrapper[4914]: E0130 21:36:09.121616 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-notification-agent" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121624 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-notification-agent" Jan 30 21:36:09 crc kubenswrapper[4914]: E0130 21:36:09.121639 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-central-agent" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121645 4914 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-central-agent" Jan 30 21:36:09 crc kubenswrapper[4914]: E0130 21:36:09.121655 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="sg-core" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121661 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="sg-core" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121854 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="sg-core" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121878 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="proxy-httpd" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121889 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-central-agent" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.121897 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd0ddef-4517-43be-843d-e578319733d0" containerName="ceilometer-notification-agent" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.123677 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.131448 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.131625 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.132443 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.136228 4914 scope.go:117] "RemoveContainer" containerID="0e1059693a39abe2197c88da5e4b7fef209bcd46abe40c5601321ba31280fd84" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.172653 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.175910 4914 scope.go:117] "RemoveContainer" containerID="71acee96adf3a1a43380a6c77086f652d72a339d33b4748f21a9fcda7b900d3c" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271081 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw4rn\" (UniqueName: \"kubernetes.io/projected/fc7d74fc-261a-45d8-befe-65efa353a56c-kube-api-access-tw4rn\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271134 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271214 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271241 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271257 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-config-data\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271280 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-scripts\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271300 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-run-httpd\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.271325 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-log-httpd\") pod \"ceilometer-0\" (UID: 
\"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.372984 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373032 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-config-data\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373071 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-scripts\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373094 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-run-httpd\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373132 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-log-httpd\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373303 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw4rn\" (UniqueName: 
\"kubernetes.io/projected/fc7d74fc-261a-45d8-befe-65efa353a56c-kube-api-access-tw4rn\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373326 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.373405 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.374210 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-log-httpd\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.374234 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-run-httpd\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.378038 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 
21:36:09.379747 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-config-data\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.381492 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.382070 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.383402 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-scripts\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.393639 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw4rn\" (UniqueName: \"kubernetes.io/projected/fc7d74fc-261a-45d8-befe-65efa353a56c-kube-api-access-tw4rn\") pod \"ceilometer-0\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.449676 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.830418 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd0ddef-4517-43be-843d-e578319733d0" path="/var/lib/kubelet/pods/2cd0ddef-4517-43be-843d-e578319733d0/volumes" Jan 30 21:36:09 crc kubenswrapper[4914]: W0130 21:36:09.936239 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc7d74fc_261a_45d8_befe_65efa353a56c.slice/crio-9b99c8eadff477b9434512c29f4b0d6e82c86b8de253ebde4ba574370bf93db8 WatchSource:0}: Error finding container 9b99c8eadff477b9434512c29f4b0d6e82c86b8de253ebde4ba574370bf93db8: Status 404 returned error can't find the container with id 9b99c8eadff477b9434512c29f4b0d6e82c86b8de253ebde4ba574370bf93db8 Jan 30 21:36:09 crc kubenswrapper[4914]: I0130 21:36:09.937558 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:10 crc kubenswrapper[4914]: I0130 21:36:10.051399 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerStarted","Data":"9b99c8eadff477b9434512c29f4b0d6e82c86b8de253ebde4ba574370bf93db8"} Jan 30 21:36:12 crc kubenswrapper[4914]: I0130 21:36:12.069461 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerStarted","Data":"a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98"} Jan 30 21:36:14 crc kubenswrapper[4914]: I0130 21:36:14.105047 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerStarted","Data":"b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af"} Jan 30 21:36:15 crc kubenswrapper[4914]: I0130 21:36:15.119636 4914 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerStarted","Data":"b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1"} Jan 30 21:36:16 crc kubenswrapper[4914]: I0130 21:36:16.949016 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.189804 4914 generic.go:334] "Generic (PLEG): container finished" podID="39e5479a-5cbf-479b-b3b9-3af2a3424492" containerID="0de72de9e1c09ecfa7bb5ad9c1b252f5a872040b65cefa8c01285030bee47686" exitCode=0 Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.189840 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" event={"ID":"39e5479a-5cbf-479b-b3b9-3af2a3424492","Type":"ContainerDied","Data":"0de72de9e1c09ecfa7bb5ad9c1b252f5a872040b65cefa8c01285030bee47686"} Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.193081 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerStarted","Data":"81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833"} Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.193268 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-central-agent" containerID="cri-o://a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98" gracePeriod=30 Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.193424 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.193485 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="proxy-httpd" 
containerID="cri-o://81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833" gracePeriod=30 Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.193523 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-notification-agent" containerID="cri-o://b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af" gracePeriod=30 Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.193620 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="sg-core" containerID="cri-o://b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1" gracePeriod=30 Jan 30 21:36:21 crc kubenswrapper[4914]: I0130 21:36:21.246682 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.494663181 podStartE2EDuration="12.246662544s" podCreationTimestamp="2026-01-30 21:36:09 +0000 UTC" firstStartedPulling="2026-01-30 21:36:09.939298313 +0000 UTC m=+1303.377935074" lastFinishedPulling="2026-01-30 21:36:20.691297676 +0000 UTC m=+1314.129934437" observedRunningTime="2026-01-30 21:36:21.242794332 +0000 UTC m=+1314.681431123" watchObservedRunningTime="2026-01-30 21:36:21.246662544 +0000 UTC m=+1314.685299305" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.214913 4914 generic.go:334] "Generic (PLEG): container finished" podID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerID="81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833" exitCode=0 Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.214971 4914 generic.go:334] "Generic (PLEG): container finished" podID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerID="b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1" exitCode=2 Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.214993 4914 
generic.go:334] "Generic (PLEG): container finished" podID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerID="b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af" exitCode=0 Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.217929 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerDied","Data":"81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833"} Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.218539 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerDied","Data":"b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1"} Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.218793 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerDied","Data":"b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af"} Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.698038 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.794466 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-combined-ca-bundle\") pod \"39e5479a-5cbf-479b-b3b9-3af2a3424492\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.794585 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-scripts\") pod \"39e5479a-5cbf-479b-b3b9-3af2a3424492\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.794671 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgsnx\" (UniqueName: \"kubernetes.io/projected/39e5479a-5cbf-479b-b3b9-3af2a3424492-kube-api-access-sgsnx\") pod \"39e5479a-5cbf-479b-b3b9-3af2a3424492\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.794910 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-config-data\") pod \"39e5479a-5cbf-479b-b3b9-3af2a3424492\" (UID: \"39e5479a-5cbf-479b-b3b9-3af2a3424492\") " Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.804864 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e5479a-5cbf-479b-b3b9-3af2a3424492-kube-api-access-sgsnx" (OuterVolumeSpecName: "kube-api-access-sgsnx") pod "39e5479a-5cbf-479b-b3b9-3af2a3424492" (UID: "39e5479a-5cbf-479b-b3b9-3af2a3424492"). InnerVolumeSpecName "kube-api-access-sgsnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.823718 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-scripts" (OuterVolumeSpecName: "scripts") pod "39e5479a-5cbf-479b-b3b9-3af2a3424492" (UID: "39e5479a-5cbf-479b-b3b9-3af2a3424492"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.828467 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.828492 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgsnx\" (UniqueName: \"kubernetes.io/projected/39e5479a-5cbf-479b-b3b9-3af2a3424492-kube-api-access-sgsnx\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.833106 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39e5479a-5cbf-479b-b3b9-3af2a3424492" (UID: "39e5479a-5cbf-479b-b3b9-3af2a3424492"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.834173 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-config-data" (OuterVolumeSpecName: "config-data") pod "39e5479a-5cbf-479b-b3b9-3af2a3424492" (UID: "39e5479a-5cbf-479b-b3b9-3af2a3424492"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.931420 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:22 crc kubenswrapper[4914]: I0130 21:36:22.931469 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e5479a-5cbf-479b-b3b9-3af2a3424492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.230977 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" event={"ID":"39e5479a-5cbf-479b-b3b9-3af2a3424492","Type":"ContainerDied","Data":"ed57e5c786900b84549119a3925a4f9c7d6b8c4bf040dcbe0fb5f21012c7e9a2"} Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.231248 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed57e5c786900b84549119a3925a4f9c7d6b8c4bf040dcbe0fb5f21012c7e9a2" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.231081 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8vzt" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.352287 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 21:36:23 crc kubenswrapper[4914]: E0130 21:36:23.352809 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e5479a-5cbf-479b-b3b9-3af2a3424492" containerName="nova-cell0-conductor-db-sync" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.352837 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e5479a-5cbf-479b-b3b9-3af2a3424492" containerName="nova-cell0-conductor-db-sync" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.353099 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e5479a-5cbf-479b-b3b9-3af2a3424492" containerName="nova-cell0-conductor-db-sync" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.353997 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.357550 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.357638 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kmd89" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.364819 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.442994 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pk5b\" (UniqueName: \"kubernetes.io/projected/8c466dcf-34cd-44fc-9516-16b7ec5cb492-kube-api-access-8pk5b\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc 
kubenswrapper[4914]: I0130 21:36:23.443053 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c466dcf-34cd-44fc-9516-16b7ec5cb492-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.443090 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c466dcf-34cd-44fc-9516-16b7ec5cb492-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.544931 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pk5b\" (UniqueName: \"kubernetes.io/projected/8c466dcf-34cd-44fc-9516-16b7ec5cb492-kube-api-access-8pk5b\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.544966 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c466dcf-34cd-44fc-9516-16b7ec5cb492-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.544986 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c466dcf-34cd-44fc-9516-16b7ec5cb492-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.549768 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c466dcf-34cd-44fc-9516-16b7ec5cb492-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.552760 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c466dcf-34cd-44fc-9516-16b7ec5cb492-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.563207 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pk5b\" (UniqueName: \"kubernetes.io/projected/8c466dcf-34cd-44fc-9516-16b7ec5cb492-kube-api-access-8pk5b\") pod \"nova-cell0-conductor-0\" (UID: \"8c466dcf-34cd-44fc-9516-16b7ec5cb492\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:23 crc kubenswrapper[4914]: I0130 21:36:23.676600 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.022399 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157297 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-sg-core-conf-yaml\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157392 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw4rn\" (UniqueName: \"kubernetes.io/projected/fc7d74fc-261a-45d8-befe-65efa353a56c-kube-api-access-tw4rn\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157438 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-ceilometer-tls-certs\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157478 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-run-httpd\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157514 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-scripts\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157564 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-log-httpd\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157617 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-combined-ca-bundle\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.157697 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-config-data\") pod \"fc7d74fc-261a-45d8-befe-65efa353a56c\" (UID: \"fc7d74fc-261a-45d8-befe-65efa353a56c\") " Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.158152 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.159077 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.175565 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-scripts" (OuterVolumeSpecName: "scripts") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.181863 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7d74fc-261a-45d8-befe-65efa353a56c-kube-api-access-tw4rn" (OuterVolumeSpecName: "kube-api-access-tw4rn") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "kube-api-access-tw4rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.189940 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.192692 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: W0130 21:36:24.196125 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c466dcf_34cd_44fc_9516_16b7ec5cb492.slice/crio-c9c014f3e500b2f2f99ba489d9e955497a259dce13dc435407a2eedf6c9cda32 WatchSource:0}: Error finding container c9c014f3e500b2f2f99ba489d9e955497a259dce13dc435407a2eedf6c9cda32: Status 404 returned error can't find the container with id c9c014f3e500b2f2f99ba489d9e955497a259dce13dc435407a2eedf6c9cda32 Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.238029 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.247609 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8c466dcf-34cd-44fc-9516-16b7ec5cb492","Type":"ContainerStarted","Data":"c9c014f3e500b2f2f99ba489d9e955497a259dce13dc435407a2eedf6c9cda32"} Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.250172 4914 generic.go:334] "Generic (PLEG): container finished" podID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerID="a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98" exitCode=0 Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.250206 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerDied","Data":"a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98"} Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.250227 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc7d74fc-261a-45d8-befe-65efa353a56c","Type":"ContainerDied","Data":"9b99c8eadff477b9434512c29f4b0d6e82c86b8de253ebde4ba574370bf93db8"} Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.250246 4914 scope.go:117] "RemoveContainer" containerID="81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.250351 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.260051 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.260077 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw4rn\" (UniqueName: \"kubernetes.io/projected/fc7d74fc-261a-45d8-befe-65efa353a56c-kube-api-access-tw4rn\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.260088 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.260097 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.260106 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.260114 4914 reconciler_common.go:293] 
"Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc7d74fc-261a-45d8-befe-65efa353a56c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.305661 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-config-data" (OuterVolumeSpecName: "config-data") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.309198 4914 scope.go:117] "RemoveContainer" containerID="b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.328746 4914 scope.go:117] "RemoveContainer" containerID="b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.331742 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc7d74fc-261a-45d8-befe-65efa353a56c" (UID: "fc7d74fc-261a-45d8-befe-65efa353a56c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.368625 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.368728 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7d74fc-261a-45d8-befe-65efa353a56c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.370664 4914 scope.go:117] "RemoveContainer" containerID="a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.418783 4914 scope.go:117] "RemoveContainer" containerID="81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.420695 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833\": container with ID starting with 81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833 not found: ID does not exist" containerID="81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.420771 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833"} err="failed to get container status \"81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833\": rpc error: code = NotFound desc = could not find container \"81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833\": container with ID starting with 81deb208abb67bd322a1f14bc5b5aadee0982fd2bd8f2cae2e9457e57f45f833 not found: ID does not 
exist" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.420808 4914 scope.go:117] "RemoveContainer" containerID="b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.421340 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1\": container with ID starting with b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1 not found: ID does not exist" containerID="b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.421363 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1"} err="failed to get container status \"b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1\": rpc error: code = NotFound desc = could not find container \"b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1\": container with ID starting with b5327fceb0a1cd0d3ae3e02fd6a2439c617db89c52a4baa4e7c585202c503ee1 not found: ID does not exist" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.421377 4914 scope.go:117] "RemoveContainer" containerID="b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.422035 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af\": container with ID starting with b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af not found: ID does not exist" containerID="b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.422094 4914 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af"} err="failed to get container status \"b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af\": rpc error: code = NotFound desc = could not find container \"b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af\": container with ID starting with b268b490bb1c30992868c5646cd579ab7e9607b87139e46b7232a607e93957af not found: ID does not exist" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.422128 4914 scope.go:117] "RemoveContainer" containerID="a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.422612 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98\": container with ID starting with a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98 not found: ID does not exist" containerID="a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.422641 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98"} err="failed to get container status \"a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98\": rpc error: code = NotFound desc = could not find container \"a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98\": container with ID starting with a4933dd9f116bf683272a00e6de2522e0fba09fa37e0db927f194526fc8d3e98 not found: ID does not exist" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.607548 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.616788 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.633162 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.633738 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="sg-core" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.633760 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="sg-core" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.633777 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-notification-agent" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.633786 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-notification-agent" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.633798 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-central-agent" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.633804 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-central-agent" Jan 30 21:36:24 crc kubenswrapper[4914]: E0130 21:36:24.633812 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="proxy-httpd" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.633818 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="proxy-httpd" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.634004 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="proxy-httpd" Jan 30 
21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.634016 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="sg-core" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.634032 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-central-agent" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.634048 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" containerName="ceilometer-notification-agent" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.636296 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.641340 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.653761 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.653936 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.654063 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.775416 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-log-httpd\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.775480 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-config-data\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.775565 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.775777 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgd8p\" (UniqueName: \"kubernetes.io/projected/9616e365-c35d-4adb-96e5-81c1e34c7068-kube-api-access-bgd8p\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.775894 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.775981 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.776039 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-run-httpd\") pod \"ceilometer-0\" 
(UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.776087 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-scripts\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.877815 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.877891 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-run-httpd\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.877966 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-scripts\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878113 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-log-httpd\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878161 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-config-data\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878239 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878328 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgd8p\" (UniqueName: \"kubernetes.io/projected/9616e365-c35d-4adb-96e5-81c1e34c7068-kube-api-access-bgd8p\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878379 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878813 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-log-httpd\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.878864 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-run-httpd\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc 
kubenswrapper[4914]: I0130 21:36:24.882391 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.882870 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.884378 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-scripts\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.893619 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgd8p\" (UniqueName: \"kubernetes.io/projected/9616e365-c35d-4adb-96e5-81c1e34c7068-kube-api-access-bgd8p\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.900425 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.900800 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-config-data\") pod \"ceilometer-0\" 
(UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " pod="openstack/ceilometer-0" Jan 30 21:36:24 crc kubenswrapper[4914]: I0130 21:36:24.957791 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:36:25 crc kubenswrapper[4914]: I0130 21:36:25.262378 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8c466dcf-34cd-44fc-9516-16b7ec5cb492","Type":"ContainerStarted","Data":"0c1feac1e6b5fcb68c5e3e1149d82fb34c12de05e9266f4151ae90ae35a88666"} Jan 30 21:36:25 crc kubenswrapper[4914]: I0130 21:36:25.262645 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:25 crc kubenswrapper[4914]: I0130 21:36:25.414196 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.414177732 podStartE2EDuration="2.414177732s" podCreationTimestamp="2026-01-30 21:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:25.28056561 +0000 UTC m=+1318.719202391" watchObservedRunningTime="2026-01-30 21:36:25.414177732 +0000 UTC m=+1318.852814493" Jan 30 21:36:25 crc kubenswrapper[4914]: I0130 21:36:25.418827 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:36:25 crc kubenswrapper[4914]: W0130 21:36:25.422908 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9616e365_c35d_4adb_96e5_81c1e34c7068.slice/crio-572158e158ce20e5570b482ce8586130f37943ff6bebe1243dbbd358d75405d7 WatchSource:0}: Error finding container 572158e158ce20e5570b482ce8586130f37943ff6bebe1243dbbd358d75405d7: Status 404 returned error can't find the container with id 572158e158ce20e5570b482ce8586130f37943ff6bebe1243dbbd358d75405d7 Jan 30 21:36:25 crc 
kubenswrapper[4914]: I0130 21:36:25.833422 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7d74fc-261a-45d8-befe-65efa353a56c" path="/var/lib/kubelet/pods/fc7d74fc-261a-45d8-befe-65efa353a56c/volumes" Jan 30 21:36:26 crc kubenswrapper[4914]: I0130 21:36:26.276809 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerStarted","Data":"572158e158ce20e5570b482ce8586130f37943ff6bebe1243dbbd358d75405d7"} Jan 30 21:36:27 crc kubenswrapper[4914]: I0130 21:36:27.287703 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerStarted","Data":"e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531"} Jan 30 21:36:27 crc kubenswrapper[4914]: I0130 21:36:27.289005 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerStarted","Data":"6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31"} Jan 30 21:36:28 crc kubenswrapper[4914]: I0130 21:36:28.306239 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerStarted","Data":"1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e"} Jan 30 21:36:30 crc kubenswrapper[4914]: I0130 21:36:30.340351 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerStarted","Data":"78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59"} Jan 30 21:36:30 crc kubenswrapper[4914]: I0130 21:36:30.341164 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:36:30 crc kubenswrapper[4914]: I0130 21:36:30.368657 4914 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.864186085 podStartE2EDuration="6.368626086s" podCreationTimestamp="2026-01-30 21:36:24 +0000 UTC" firstStartedPulling="2026-01-30 21:36:25.426844545 +0000 UTC m=+1318.865481306" lastFinishedPulling="2026-01-30 21:36:29.931284546 +0000 UTC m=+1323.369921307" observedRunningTime="2026-01-30 21:36:30.36338369 +0000 UTC m=+1323.802020491" watchObservedRunningTime="2026-01-30 21:36:30.368626086 +0000 UTC m=+1323.807262887" Jan 30 21:36:33 crc kubenswrapper[4914]: I0130 21:36:33.708123 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.347961 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jwsg7"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.349244 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.362812 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jwsg7"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.390389 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.390441 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.491137 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-config-data\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.492084 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-scripts\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.492336 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92pf\" (UniqueName: \"kubernetes.io/projected/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-kube-api-access-q92pf\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.492379 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.560754 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.562120 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.564477 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.588781 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.594531 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92pf\" (UniqueName: \"kubernetes.io/projected/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-kube-api-access-q92pf\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.594587 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.594698 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-config-data\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.594767 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-scripts\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: 
I0130 21:36:34.602658 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.608802 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-scripts\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.621377 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-config-data\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.623324 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92pf\" (UniqueName: \"kubernetes.io/projected/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-kube-api-access-q92pf\") pod \"nova-cell0-cell-mapping-jwsg7\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.642444 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.644901 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.649175 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.660375 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.692651 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.694037 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.696808 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.697019 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.697118 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.697173 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tm4\" (UniqueName: \"kubernetes.io/projected/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-kube-api-access-g9tm4\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.716819 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.752512 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.799416 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-config-data\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.799466 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c534e-f1a8-412b-9a62-df7cb562d938-logs\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.799680 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.799840 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bht6z\" (UniqueName: \"kubernetes.io/projected/3f6c534e-f1a8-412b-9a62-df7cb562d938-kube-api-access-bht6z\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.799934 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-config-data\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.800077 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.800126 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26l24\" (UniqueName: \"kubernetes.io/projected/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-kube-api-access-26l24\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.800197 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tm4\" (UniqueName: \"kubernetes.io/projected/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-kube-api-access-g9tm4\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.800272 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.800312 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.806522 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.818798 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.835399 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.836976 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.842444 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.852349 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tm4\" (UniqueName: \"kubernetes.io/projected/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-kube-api-access-g9tm4\") pod \"nova-cell1-novncproxy-0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.885676 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.905773 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26l24\" (UniqueName: \"kubernetes.io/projected/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-kube-api-access-26l24\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.905861 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.905891 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.905948 4914 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-config-data\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.905979 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c534e-f1a8-412b-9a62-df7cb562d938-logs\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.906066 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bht6z\" (UniqueName: \"kubernetes.io/projected/3f6c534e-f1a8-412b-9a62-df7cb562d938-kube-api-access-bht6z\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.906132 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-config-data\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.907547 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c534e-f1a8-412b-9a62-df7cb562d938-logs\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.908056 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-jkq7z"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.910784 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.915416 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.917043 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.920396 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-config-data\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.921488 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-config-data\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.928847 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26l24\" (UniqueName: \"kubernetes.io/projected/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-kube-api-access-26l24\") pod \"nova-scheduler-0\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " pod="openstack/nova-scheduler-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.946075 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bht6z\" (UniqueName: 
\"kubernetes.io/projected/3f6c534e-f1a8-412b-9a62-df7cb562d938-kube-api-access-bht6z\") pod \"nova-api-0\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " pod="openstack/nova-api-0" Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.951651 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-jkq7z"] Jan 30 21:36:34 crc kubenswrapper[4914]: I0130 21:36:34.996990 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.013443 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.013484 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-config-data\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.013606 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrw2\" (UniqueName: \"kubernetes.io/projected/faaa496b-9482-4183-b22e-a1f0d78bb72b-kube-api-access-7wrw2\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.013639 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faaa496b-9482-4183-b22e-a1f0d78bb72b-logs\") pod \"nova-metadata-0\" (UID: 
\"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.029840 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.045430 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.115372 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-svc\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.115767 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.115845 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.115874 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-config-data\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 
21:36:35.115996 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-config\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.116048 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgjz\" (UniqueName: \"kubernetes.io/projected/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-kube-api-access-5xgjz\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.116109 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrw2\" (UniqueName: \"kubernetes.io/projected/faaa496b-9482-4183-b22e-a1f0d78bb72b-kube-api-access-7wrw2\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.116149 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faaa496b-9482-4183-b22e-a1f0d78bb72b-logs\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.116198 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.116227 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.119335 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faaa496b-9482-4183-b22e-a1f0d78bb72b-logs\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.133351 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.139128 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-config-data\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.142325 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrw2\" (UniqueName: \"kubernetes.io/projected/faaa496b-9482-4183-b22e-a1f0d78bb72b-kube-api-access-7wrw2\") pod \"nova-metadata-0\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.218646 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-config\") pod 
\"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.218729 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgjz\" (UniqueName: \"kubernetes.io/projected/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-kube-api-access-5xgjz\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.218809 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.218832 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.218893 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-svc\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.218928 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: 
\"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.219600 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-config\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.219693 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.220303 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.220850 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-svc\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.220867 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc 
kubenswrapper[4914]: I0130 21:36:35.244128 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgjz\" (UniqueName: \"kubernetes.io/projected/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-kube-api-access-5xgjz\") pod \"dnsmasq-dns-78cd565959-jkq7z\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.244590 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.266948 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.484632 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jwsg7"] Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.638132 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.797858 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rq48f"] Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.799357 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.804391 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.809144 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.889697 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rq48f"] Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.953229 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.953296 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhrrr\" (UniqueName: \"kubernetes.io/projected/aa0fec57-7f38-455f-85d0-47b90e552b48-kube-api-access-dhrrr\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.953398 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-config-data\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.953437 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-scripts\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:35 crc kubenswrapper[4914]: I0130 21:36:35.993672 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.065884 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.065929 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhrrr\" (UniqueName: \"kubernetes.io/projected/aa0fec57-7f38-455f-85d0-47b90e552b48-kube-api-access-dhrrr\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.065998 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-config-data\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.066027 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-scripts\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") 
" pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.088614 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhrrr\" (UniqueName: \"kubernetes.io/projected/aa0fec57-7f38-455f-85d0-47b90e552b48-kube-api-access-dhrrr\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.091484 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.091792 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-scripts\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.093097 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-config-data\") pod \"nova-cell1-conductor-db-sync-rq48f\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.162550 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.208048 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.406328 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-jkq7z"] Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.424728 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.471951 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jwsg7" event={"ID":"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b","Type":"ContainerStarted","Data":"04f6826f8e64a5a3980bb9d298190a1a13ef171bbe8a077a3f54e49b7502d7c0"} Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.471993 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jwsg7" event={"ID":"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b","Type":"ContainerStarted","Data":"a518b3708bedbb42dca26157181173f83c9c07bf5cb6cb160890e801732ed16f"} Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.514484 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0","Type":"ContainerStarted","Data":"48dc8c08272759bfe3e4c83118962687e0fe237cd51baddf66eb8a0eaa8b1a82"} Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.531957 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f6c534e-f1a8-412b-9a62-df7cb562d938","Type":"ContainerStarted","Data":"d0e92b7a3d65142932174a5cd617354c7fc288838c0cce8c809500daddd1a18f"} Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.539418 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"726bfdd6-4d21-4d07-921b-ef9f28ff96c8","Type":"ContainerStarted","Data":"c23323013ecd18323321984eb91eaf18cf0e1a9c296675558182fd52116db3ab"} Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.569758 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jwsg7" podStartSLOduration=2.569740614 podStartE2EDuration="2.569740614s" podCreationTimestamp="2026-01-30 21:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:36.537993984 +0000 UTC m=+1329.976630745" watchObservedRunningTime="2026-01-30 21:36:36.569740614 +0000 UTC m=+1330.008377375" Jan 30 21:36:36 crc kubenswrapper[4914]: I0130 21:36:36.854184 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rq48f"] Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.566030 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faaa496b-9482-4183-b22e-a1f0d78bb72b","Type":"ContainerStarted","Data":"bc5d394a9845f0e74f6583adb4f23955866c88f0bd5e3c8d42382030e46df7e3"} Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.582766 4914 generic.go:334] "Generic (PLEG): container finished" podID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerID="2bc925bd362ee7effeda0ca668fbfbf0d9b1efb50278fac7833e2515aee512d2" exitCode=0 Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.582828 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" event={"ID":"ba416ff3-6b9e-42c4-bf91-582be4df1ed1","Type":"ContainerDied","Data":"2bc925bd362ee7effeda0ca668fbfbf0d9b1efb50278fac7833e2515aee512d2"} Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.582853 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" 
event={"ID":"ba416ff3-6b9e-42c4-bf91-582be4df1ed1","Type":"ContainerStarted","Data":"a388ee1ca194c1117d792ee610a5ec86f595666d6cacf1ed9819f63a5cc86e76"} Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.606871 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rq48f" event={"ID":"aa0fec57-7f38-455f-85d0-47b90e552b48","Type":"ContainerStarted","Data":"900fcfdcc8d7e2ca1fb8bbb34b36214eafb90607ae9bb890b1d4505d60d8bf3d"} Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.606922 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rq48f" event={"ID":"aa0fec57-7f38-455f-85d0-47b90e552b48","Type":"ContainerStarted","Data":"2d744745950f5fc5dd626af935969b4a8af7e9541c6dff4997433b0af8f8cf50"} Jan 30 21:36:37 crc kubenswrapper[4914]: I0130 21:36:37.635776 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rq48f" podStartSLOduration=2.63575883 podStartE2EDuration="2.63575883s" podCreationTimestamp="2026-01-30 21:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:37.634362136 +0000 UTC m=+1331.072998887" watchObservedRunningTime="2026-01-30 21:36:37.63575883 +0000 UTC m=+1331.074395591" Jan 30 21:36:38 crc kubenswrapper[4914]: I0130 21:36:38.537596 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:38 crc kubenswrapper[4914]: I0130 21:36:38.624109 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" event={"ID":"ba416ff3-6b9e-42c4-bf91-582be4df1ed1","Type":"ContainerStarted","Data":"83bb224c73761434b974004f5d068b75a0e7edaf414ade541058e1feb0c0ee53"} Jan 30 21:36:38 crc kubenswrapper[4914]: I0130 21:36:38.624153 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:38 crc kubenswrapper[4914]: I0130 21:36:38.628482 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:36:38 crc kubenswrapper[4914]: I0130 21:36:38.663490 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" podStartSLOduration=4.663472268 podStartE2EDuration="4.663472268s" podCreationTimestamp="2026-01-30 21:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:38.657541875 +0000 UTC m=+1332.096178636" watchObservedRunningTime="2026-01-30 21:36:38.663472268 +0000 UTC m=+1332.102109029" Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.662028 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"726bfdd6-4d21-4d07-921b-ef9f28ff96c8","Type":"ContainerStarted","Data":"1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14"} Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.666063 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faaa496b-9482-4183-b22e-a1f0d78bb72b","Type":"ContainerStarted","Data":"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a"} Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.666322 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faaa496b-9482-4183-b22e-a1f0d78bb72b","Type":"ContainerStarted","Data":"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb"} Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.666170 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-metadata" 
containerID="cri-o://1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a" gracePeriod=30 Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.666130 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-log" containerID="cri-o://f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb" gracePeriod=30 Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.668152 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0","Type":"ContainerStarted","Data":"e71fc1513988061b645764815fbe699a0f4c556ff0363418c53f989c84361f40"} Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.668242 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e71fc1513988061b645764815fbe699a0f4c556ff0363418c53f989c84361f40" gracePeriod=30 Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.670258 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f6c534e-f1a8-412b-9a62-df7cb562d938","Type":"ContainerStarted","Data":"fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99"} Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.670287 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f6c534e-f1a8-412b-9a62-df7cb562d938","Type":"ContainerStarted","Data":"cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64"} Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.687236 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.48716677 podStartE2EDuration="7.687213636s" 
podCreationTimestamp="2026-01-30 21:36:34 +0000 UTC" firstStartedPulling="2026-01-30 21:36:36.245771022 +0000 UTC m=+1329.684407783" lastFinishedPulling="2026-01-30 21:36:40.445817888 +0000 UTC m=+1333.884454649" observedRunningTime="2026-01-30 21:36:41.676891859 +0000 UTC m=+1335.115528660" watchObservedRunningTime="2026-01-30 21:36:41.687213636 +0000 UTC m=+1335.125850407" Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.719848 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.24219693 podStartE2EDuration="7.719826028s" podCreationTimestamp="2026-01-30 21:36:34 +0000 UTC" firstStartedPulling="2026-01-30 21:36:36.008359353 +0000 UTC m=+1329.446996114" lastFinishedPulling="2026-01-30 21:36:40.485988451 +0000 UTC m=+1333.924625212" observedRunningTime="2026-01-30 21:36:41.714851399 +0000 UTC m=+1335.153488170" watchObservedRunningTime="2026-01-30 21:36:41.719826028 +0000 UTC m=+1335.158462809" Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.740900 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.690664595 podStartE2EDuration="7.740878122s" podCreationTimestamp="2026-01-30 21:36:34 +0000 UTC" firstStartedPulling="2026-01-30 21:36:36.421964233 +0000 UTC m=+1329.860600994" lastFinishedPulling="2026-01-30 21:36:40.47217775 +0000 UTC m=+1333.910814521" observedRunningTime="2026-01-30 21:36:41.73327605 +0000 UTC m=+1335.171912801" watchObservedRunningTime="2026-01-30 21:36:41.740878122 +0000 UTC m=+1335.179514883" Jan 30 21:36:41 crc kubenswrapper[4914]: I0130 21:36:41.758942 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.116705273 podStartE2EDuration="7.758912984s" podCreationTimestamp="2026-01-30 21:36:34 +0000 UTC" firstStartedPulling="2026-01-30 21:36:35.733063106 +0000 UTC m=+1329.171699867" 
lastFinishedPulling="2026-01-30 21:36:40.375270817 +0000 UTC m=+1333.813907578" observedRunningTime="2026-01-30 21:36:41.747991823 +0000 UTC m=+1335.186628584" watchObservedRunningTime="2026-01-30 21:36:41.758912984 +0000 UTC m=+1335.197549765" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.364165 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.467339 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wrw2\" (UniqueName: \"kubernetes.io/projected/faaa496b-9482-4183-b22e-a1f0d78bb72b-kube-api-access-7wrw2\") pod \"faaa496b-9482-4183-b22e-a1f0d78bb72b\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.467522 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faaa496b-9482-4183-b22e-a1f0d78bb72b-logs\") pod \"faaa496b-9482-4183-b22e-a1f0d78bb72b\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.467654 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-config-data\") pod \"faaa496b-9482-4183-b22e-a1f0d78bb72b\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.467678 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-combined-ca-bundle\") pod \"faaa496b-9482-4183-b22e-a1f0d78bb72b\" (UID: \"faaa496b-9482-4183-b22e-a1f0d78bb72b\") " Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.467922 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/faaa496b-9482-4183-b22e-a1f0d78bb72b-logs" (OuterVolumeSpecName: "logs") pod "faaa496b-9482-4183-b22e-a1f0d78bb72b" (UID: "faaa496b-9482-4183-b22e-a1f0d78bb72b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.468218 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faaa496b-9482-4183-b22e-a1f0d78bb72b-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.473850 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faaa496b-9482-4183-b22e-a1f0d78bb72b-kube-api-access-7wrw2" (OuterVolumeSpecName: "kube-api-access-7wrw2") pod "faaa496b-9482-4183-b22e-a1f0d78bb72b" (UID: "faaa496b-9482-4183-b22e-a1f0d78bb72b"). InnerVolumeSpecName "kube-api-access-7wrw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.501034 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faaa496b-9482-4183-b22e-a1f0d78bb72b" (UID: "faaa496b-9482-4183-b22e-a1f0d78bb72b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.502204 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-config-data" (OuterVolumeSpecName: "config-data") pod "faaa496b-9482-4183-b22e-a1f0d78bb72b" (UID: "faaa496b-9482-4183-b22e-a1f0d78bb72b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.571069 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.571134 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaa496b-9482-4183-b22e-a1f0d78bb72b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.571151 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wrw2\" (UniqueName: \"kubernetes.io/projected/faaa496b-9482-4183-b22e-a1f0d78bb72b-kube-api-access-7wrw2\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.680438 4914 generic.go:334] "Generic (PLEG): container finished" podID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerID="1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a" exitCode=0 Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.680473 4914 generic.go:334] "Generic (PLEG): container finished" podID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerID="f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb" exitCode=143 Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.680562 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faaa496b-9482-4183-b22e-a1f0d78bb72b","Type":"ContainerDied","Data":"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a"} Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.680619 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faaa496b-9482-4183-b22e-a1f0d78bb72b","Type":"ContainerDied","Data":"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb"} Jan 30 21:36:42 crc 
kubenswrapper[4914]: I0130 21:36:42.680636 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"faaa496b-9482-4183-b22e-a1f0d78bb72b","Type":"ContainerDied","Data":"bc5d394a9845f0e74f6583adb4f23955866c88f0bd5e3c8d42382030e46df7e3"} Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.680658 4914 scope.go:117] "RemoveContainer" containerID="1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.681111 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.712453 4914 scope.go:117] "RemoveContainer" containerID="f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.739857 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.755661 4914 scope.go:117] "RemoveContainer" containerID="1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a" Jan 30 21:36:42 crc kubenswrapper[4914]: E0130 21:36:42.756527 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a\": container with ID starting with 1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a not found: ID does not exist" containerID="1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.756565 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a"} err="failed to get container status \"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a\": rpc error: code = NotFound desc = could not find 
container \"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a\": container with ID starting with 1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a not found: ID does not exist" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.756594 4914 scope.go:117] "RemoveContainer" containerID="f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb" Jan 30 21:36:42 crc kubenswrapper[4914]: E0130 21:36:42.760484 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb\": container with ID starting with f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb not found: ID does not exist" containerID="f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.760537 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb"} err="failed to get container status \"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb\": rpc error: code = NotFound desc = could not find container \"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb\": container with ID starting with f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb not found: ID does not exist" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.760558 4914 scope.go:117] "RemoveContainer" containerID="1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.760905 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a"} err="failed to get container status \"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a\": rpc error: code = NotFound desc = could 
not find container \"1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a\": container with ID starting with 1d9f261e5d1ab4484e49c629ecab98b5205777c6728c65291443408e4a8f838a not found: ID does not exist" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.760995 4914 scope.go:117] "RemoveContainer" containerID="f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.761273 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb"} err="failed to get container status \"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb\": rpc error: code = NotFound desc = could not find container \"f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb\": container with ID starting with f094d59cde2706f037897d12dfb240a99c7064f0ef9019bbec04a1fe33cd18bb not found: ID does not exist" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.765880 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.780672 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:42 crc kubenswrapper[4914]: E0130 21:36:42.781166 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-metadata" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.781188 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-metadata" Jan 30 21:36:42 crc kubenswrapper[4914]: E0130 21:36:42.781216 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-log" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.781223 4914 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-log" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.781424 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-log" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.781462 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" containerName="nova-metadata-metadata" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.782553 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.785972 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.800555 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.802717 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.879440 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.879726 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-config-data\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 
21:36:42.879900 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5f6\" (UniqueName: \"kubernetes.io/projected/3693b53a-8476-4717-bb56-d5d54f287bb7-kube-api-access-pb5f6\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.880089 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3693b53a-8476-4717-bb56-d5d54f287bb7-logs\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.880351 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.982028 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.982108 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.982135 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-config-data\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.982166 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5f6\" (UniqueName: \"kubernetes.io/projected/3693b53a-8476-4717-bb56-d5d54f287bb7-kube-api-access-pb5f6\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.982209 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3693b53a-8476-4717-bb56-d5d54f287bb7-logs\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.983317 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3693b53a-8476-4717-bb56-d5d54f287bb7-logs\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:42 crc kubenswrapper[4914]: I0130 21:36:42.989884 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-config-data\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:43 crc kubenswrapper[4914]: I0130 21:36:43.000162 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:43 crc 
kubenswrapper[4914]: I0130 21:36:43.008739 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:43 crc kubenswrapper[4914]: I0130 21:36:43.011580 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5f6\" (UniqueName: \"kubernetes.io/projected/3693b53a-8476-4717-bb56-d5d54f287bb7-kube-api-access-pb5f6\") pod \"nova-metadata-0\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " pod="openstack/nova-metadata-0" Jan 30 21:36:43 crc kubenswrapper[4914]: I0130 21:36:43.112154 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:43 crc kubenswrapper[4914]: I0130 21:36:43.566243 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:43 crc kubenswrapper[4914]: W0130 21:36:43.574528 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3693b53a_8476_4717_bb56_d5d54f287bb7.slice/crio-40a29f6f6a6629a7148216520db6637d940d218f8f7cad682a82d14c1664ab62 WatchSource:0}: Error finding container 40a29f6f6a6629a7148216520db6637d940d218f8f7cad682a82d14c1664ab62: Status 404 returned error can't find the container with id 40a29f6f6a6629a7148216520db6637d940d218f8f7cad682a82d14c1664ab62 Jan 30 21:36:43 crc kubenswrapper[4914]: I0130 21:36:43.701418 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3693b53a-8476-4717-bb56-d5d54f287bb7","Type":"ContainerStarted","Data":"40a29f6f6a6629a7148216520db6637d940d218f8f7cad682a82d14c1664ab62"} Jan 30 21:36:43 crc kubenswrapper[4914]: I0130 21:36:43.843659 4914 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="faaa496b-9482-4183-b22e-a1f0d78bb72b" path="/var/lib/kubelet/pods/faaa496b-9482-4183-b22e-a1f0d78bb72b/volumes" Jan 30 21:36:44 crc kubenswrapper[4914]: I0130 21:36:44.717131 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3693b53a-8476-4717-bb56-d5d54f287bb7","Type":"ContainerStarted","Data":"a44a377237e2b5378abe1b68ea09df3c5dd319c91785a1ed5b7944d2c7cd33a5"} Jan 30 21:36:44 crc kubenswrapper[4914]: I0130 21:36:44.717477 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3693b53a-8476-4717-bb56-d5d54f287bb7","Type":"ContainerStarted","Data":"cc62902625579326134a189c7129a2d06e13a563b27da6d652ec671fb89a77f5"} Jan 30 21:36:44 crc kubenswrapper[4914]: I0130 21:36:44.720546 4914 generic.go:334] "Generic (PLEG): container finished" podID="53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" containerID="04f6826f8e64a5a3980bb9d298190a1a13ef171bbe8a077a3f54e49b7502d7c0" exitCode=0 Jan 30 21:36:44 crc kubenswrapper[4914]: I0130 21:36:44.720616 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jwsg7" event={"ID":"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b","Type":"ContainerDied","Data":"04f6826f8e64a5a3980bb9d298190a1a13ef171bbe8a077a3f54e49b7502d7c0"} Jan 30 21:36:44 crc kubenswrapper[4914]: I0130 21:36:44.740273 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.740247877 podStartE2EDuration="2.740247877s" podCreationTimestamp="2026-01-30 21:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:44.736267741 +0000 UTC m=+1338.174904532" watchObservedRunningTime="2026-01-30 21:36:44.740247877 +0000 UTC m=+1338.178884638" Jan 30 21:36:44 crc kubenswrapper[4914]: I0130 21:36:44.997649 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.030628 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.030685 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.045615 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.045665 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.081056 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.268796 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.342115 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-spmt5"] Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.342392 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" containerName="dnsmasq-dns" containerID="cri-o://7226b109ee0610df021d7b028718ba70f315cd24195d3bf0b5a1738ea52be53b" gracePeriod=10 Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.729932 4914 generic.go:334] "Generic (PLEG): container finished" podID="aa0fec57-7f38-455f-85d0-47b90e552b48" containerID="900fcfdcc8d7e2ca1fb8bbb34b36214eafb90607ae9bb890b1d4505d60d8bf3d" exitCode=0 Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.730302 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-rq48f" event={"ID":"aa0fec57-7f38-455f-85d0-47b90e552b48","Type":"ContainerDied","Data":"900fcfdcc8d7e2ca1fb8bbb34b36214eafb90607ae9bb890b1d4505d60d8bf3d"} Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.731952 4914 generic.go:334] "Generic (PLEG): container finished" podID="f7b96580-046d-4a22-b866-78dd55234c0a" containerID="7226b109ee0610df021d7b028718ba70f315cd24195d3bf0b5a1738ea52be53b" exitCode=0 Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.732373 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" event={"ID":"f7b96580-046d-4a22-b866-78dd55234c0a","Type":"ContainerDied","Data":"7226b109ee0610df021d7b028718ba70f315cd24195d3bf0b5a1738ea52be53b"} Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.732404 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" event={"ID":"f7b96580-046d-4a22-b866-78dd55234c0a","Type":"ContainerDied","Data":"93a1d1af276d6d1b71136f60aef2e47570670954af733b34885998c3f1f31a06"} Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.732421 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93a1d1af276d6d1b71136f60aef2e47570670954af733b34885998c3f1f31a06" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.766955 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.841978 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.952479 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-swift-storage-0\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.952529 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.952667 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47sjm\" (UniqueName: \"kubernetes.io/projected/f7b96580-046d-4a22-b866-78dd55234c0a-kube-api-access-47sjm\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.952772 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-svc\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.952800 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-config\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:45 crc kubenswrapper[4914]: I0130 21:36:45.952900 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-sb\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:45.999927 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b96580-046d-4a22-b866-78dd55234c0a-kube-api-access-47sjm" (OuterVolumeSpecName: "kube-api-access-47sjm") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "kube-api-access-47sjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.038446 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.055453 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-config" (OuterVolumeSpecName: "config") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.056884 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.056989 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb\") pod \"f7b96580-046d-4a22-b866-78dd55234c0a\" (UID: \"f7b96580-046d-4a22-b866-78dd55234c0a\") " Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.057461 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.057473 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47sjm\" (UniqueName: \"kubernetes.io/projected/f7b96580-046d-4a22-b866-78dd55234c0a-kube-api-access-47sjm\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.057483 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: W0130 21:36:46.057544 4914 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f7b96580-046d-4a22-b866-78dd55234c0a/volumes/kubernetes.io~configmap/ovsdbserver-nb Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.057553 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.070244 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.079440 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7b96580-046d-4a22-b866-78dd55234c0a" (UID: "f7b96580-046d-4a22-b866-78dd55234c0a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.111905 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.112253 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.126226 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.163093 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.163135 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.163144 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7b96580-046d-4a22-b866-78dd55234c0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.266533 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-config-data\") pod \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.266819 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q92pf\" (UniqueName: \"kubernetes.io/projected/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-kube-api-access-q92pf\") pod \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.267020 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-scripts\") pod \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.267107 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-combined-ca-bundle\") pod \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\" (UID: \"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b\") " Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.271886 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-kube-api-access-q92pf" (OuterVolumeSpecName: "kube-api-access-q92pf") pod "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" (UID: "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b"). InnerVolumeSpecName "kube-api-access-q92pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.276965 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-scripts" (OuterVolumeSpecName: "scripts") pod "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" (UID: "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.293492 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" (UID: "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.306991 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-config-data" (OuterVolumeSpecName: "config-data") pod "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" (UID: "53f1479b-3e35-4ffb-81ca-4bc42fb0d36b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.370122 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q92pf\" (UniqueName: \"kubernetes.io/projected/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-kube-api-access-q92pf\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.370157 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.370170 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.370181 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.758911 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jwsg7" event={"ID":"53f1479b-3e35-4ffb-81ca-4bc42fb0d36b","Type":"ContainerDied","Data":"a518b3708bedbb42dca26157181173f83c9c07bf5cb6cb160890e801732ed16f"} Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.758973 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a518b3708bedbb42dca26157181173f83c9c07bf5cb6cb160890e801732ed16f" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.759083 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jwsg7" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.760608 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-spmt5" Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.840180 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-spmt5"] Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.851651 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-spmt5"] Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.923736 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.923989 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-log" containerID="cri-o://cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64" gracePeriod=30 Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.924153 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-api" containerID="cri-o://fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99" gracePeriod=30 Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.965088 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.973882 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.974085 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-log" containerID="cri-o://cc62902625579326134a189c7129a2d06e13a563b27da6d652ec671fb89a77f5" gracePeriod=30 Jan 30 21:36:46 crc kubenswrapper[4914]: I0130 21:36:46.974328 4914 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-metadata-0" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-metadata" containerID="cri-o://a44a377237e2b5378abe1b68ea09df3c5dd319c91785a1ed5b7944d2c7cd33a5" gracePeriod=30 Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.337102 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.395305 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhrrr\" (UniqueName: \"kubernetes.io/projected/aa0fec57-7f38-455f-85d0-47b90e552b48-kube-api-access-dhrrr\") pod \"aa0fec57-7f38-455f-85d0-47b90e552b48\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.395357 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-combined-ca-bundle\") pod \"aa0fec57-7f38-455f-85d0-47b90e552b48\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.395379 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-config-data\") pod \"aa0fec57-7f38-455f-85d0-47b90e552b48\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.395458 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-scripts\") pod \"aa0fec57-7f38-455f-85d0-47b90e552b48\" (UID: \"aa0fec57-7f38-455f-85d0-47b90e552b48\") " Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.403128 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aa0fec57-7f38-455f-85d0-47b90e552b48-kube-api-access-dhrrr" (OuterVolumeSpecName: "kube-api-access-dhrrr") pod "aa0fec57-7f38-455f-85d0-47b90e552b48" (UID: "aa0fec57-7f38-455f-85d0-47b90e552b48"). InnerVolumeSpecName "kube-api-access-dhrrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.403146 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-scripts" (OuterVolumeSpecName: "scripts") pod "aa0fec57-7f38-455f-85d0-47b90e552b48" (UID: "aa0fec57-7f38-455f-85d0-47b90e552b48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.431955 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-config-data" (OuterVolumeSpecName: "config-data") pod "aa0fec57-7f38-455f-85d0-47b90e552b48" (UID: "aa0fec57-7f38-455f-85d0-47b90e552b48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.434776 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa0fec57-7f38-455f-85d0-47b90e552b48" (UID: "aa0fec57-7f38-455f-85d0-47b90e552b48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.498470 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhrrr\" (UniqueName: \"kubernetes.io/projected/aa0fec57-7f38-455f-85d0-47b90e552b48-kube-api-access-dhrrr\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.498514 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.498525 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.498535 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0fec57-7f38-455f-85d0-47b90e552b48-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.804064 4914 generic.go:334] "Generic (PLEG): container finished" podID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerID="cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64" exitCode=143 Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.804133 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f6c534e-f1a8-412b-9a62-df7cb562d938","Type":"ContainerDied","Data":"cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64"} Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.809633 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rq48f" event={"ID":"aa0fec57-7f38-455f-85d0-47b90e552b48","Type":"ContainerDied","Data":"2d744745950f5fc5dd626af935969b4a8af7e9541c6dff4997433b0af8f8cf50"} Jan 30 
21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.809681 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d744745950f5fc5dd626af935969b4a8af7e9541c6dff4997433b0af8f8cf50" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.809695 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rq48f" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.832139 4914 generic.go:334] "Generic (PLEG): container finished" podID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerID="a44a377237e2b5378abe1b68ea09df3c5dd319c91785a1ed5b7944d2c7cd33a5" exitCode=0 Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.832168 4914 generic.go:334] "Generic (PLEG): container finished" podID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerID="cc62902625579326134a189c7129a2d06e13a563b27da6d652ec671fb89a77f5" exitCode=143 Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.832314 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" containerName="nova-scheduler-scheduler" containerID="cri-o://1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14" gracePeriod=30 Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.850151 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" path="/var/lib/kubelet/pods/f7b96580-046d-4a22-b866-78dd55234c0a/volumes" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.852128 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3693b53a-8476-4717-bb56-d5d54f287bb7","Type":"ContainerDied","Data":"a44a377237e2b5378abe1b68ea09df3c5dd319c91785a1ed5b7944d2c7cd33a5"} Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.852176 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3693b53a-8476-4717-bb56-d5d54f287bb7","Type":"ContainerDied","Data":"cc62902625579326134a189c7129a2d06e13a563b27da6d652ec671fb89a77f5"} Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.904303 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 21:36:47 crc kubenswrapper[4914]: E0130 21:36:47.904737 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" containerName="init" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.904754 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" containerName="init" Jan 30 21:36:47 crc kubenswrapper[4914]: E0130 21:36:47.904771 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" containerName="dnsmasq-dns" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.904778 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" containerName="dnsmasq-dns" Jan 30 21:36:47 crc kubenswrapper[4914]: E0130 21:36:47.904798 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" containerName="nova-manage" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.904804 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" containerName="nova-manage" Jan 30 21:36:47 crc kubenswrapper[4914]: E0130 21:36:47.904817 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0fec57-7f38-455f-85d0-47b90e552b48" containerName="nova-cell1-conductor-db-sync" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.904823 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0fec57-7f38-455f-85d0-47b90e552b48" containerName="nova-cell1-conductor-db-sync" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.905007 4914 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" containerName="nova-manage" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.905021 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0fec57-7f38-455f-85d0-47b90e552b48" containerName="nova-cell1-conductor-db-sync" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.905033 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b96580-046d-4a22-b866-78dd55234c0a" containerName="dnsmasq-dns" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.905768 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.909254 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 21:36:47 crc kubenswrapper[4914]: I0130 21:36:47.909894 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.008812 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rhw\" (UniqueName: \"kubernetes.io/projected/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-kube-api-access-58rhw\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.008929 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.009066 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.111826 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rhw\" (UniqueName: \"kubernetes.io/projected/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-kube-api-access-58rhw\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.111892 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.111995 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.112292 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.112336 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.117338 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " 
pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.117402 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.136364 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rhw\" (UniqueName: \"kubernetes.io/projected/95ce4f8e-21fa-41f4-a300-a1cff7594ce3-kube-api-access-58rhw\") pod \"nova-cell1-conductor-0\" (UID: \"95ce4f8e-21fa-41f4-a300-a1cff7594ce3\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.223038 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.232835 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.317007 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3693b53a-8476-4717-bb56-d5d54f287bb7-logs\") pod \"3693b53a-8476-4717-bb56-d5d54f287bb7\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.317248 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb5f6\" (UniqueName: \"kubernetes.io/projected/3693b53a-8476-4717-bb56-d5d54f287bb7-kube-api-access-pb5f6\") pod \"3693b53a-8476-4717-bb56-d5d54f287bb7\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.317277 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-combined-ca-bundle\") pod \"3693b53a-8476-4717-bb56-d5d54f287bb7\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.317338 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-nova-metadata-tls-certs\") pod \"3693b53a-8476-4717-bb56-d5d54f287bb7\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.317438 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-config-data\") pod \"3693b53a-8476-4717-bb56-d5d54f287bb7\" (UID: \"3693b53a-8476-4717-bb56-d5d54f287bb7\") " Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.324244 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3693b53a-8476-4717-bb56-d5d54f287bb7-kube-api-access-pb5f6" (OuterVolumeSpecName: "kube-api-access-pb5f6") pod "3693b53a-8476-4717-bb56-d5d54f287bb7" (UID: "3693b53a-8476-4717-bb56-d5d54f287bb7"). InnerVolumeSpecName "kube-api-access-pb5f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.324769 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3693b53a-8476-4717-bb56-d5d54f287bb7-logs" (OuterVolumeSpecName: "logs") pod "3693b53a-8476-4717-bb56-d5d54f287bb7" (UID: "3693b53a-8476-4717-bb56-d5d54f287bb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.353099 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-config-data" (OuterVolumeSpecName: "config-data") pod "3693b53a-8476-4717-bb56-d5d54f287bb7" (UID: "3693b53a-8476-4717-bb56-d5d54f287bb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.363390 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3693b53a-8476-4717-bb56-d5d54f287bb7" (UID: "3693b53a-8476-4717-bb56-d5d54f287bb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.409366 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3693b53a-8476-4717-bb56-d5d54f287bb7" (UID: "3693b53a-8476-4717-bb56-d5d54f287bb7"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.419854 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3693b53a-8476-4717-bb56-d5d54f287bb7-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.420251 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb5f6\" (UniqueName: \"kubernetes.io/projected/3693b53a-8476-4717-bb56-d5d54f287bb7-kube-api-access-pb5f6\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.420415 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.420520 4914 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.420579 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3693b53a-8476-4717-bb56-d5d54f287bb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.664193 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.845338 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3693b53a-8476-4717-bb56-d5d54f287bb7","Type":"ContainerDied","Data":"40a29f6f6a6629a7148216520db6637d940d218f8f7cad682a82d14c1664ab62"} Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.845756 4914 scope.go:117] 
"RemoveContainer" containerID="a44a377237e2b5378abe1b68ea09df3c5dd319c91785a1ed5b7944d2c7cd33a5" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.845367 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.847863 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95ce4f8e-21fa-41f4-a300-a1cff7594ce3","Type":"ContainerStarted","Data":"b2d1a240dc5734f901f811607e5b0a24b90c37d7adfa1aa00ed832a3ba4c85c8"} Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.881077 4914 scope.go:117] "RemoveContainer" containerID="cc62902625579326134a189c7129a2d06e13a563b27da6d652ec671fb89a77f5" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.899796 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.933981 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.943869 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:48 crc kubenswrapper[4914]: E0130 21:36:48.944506 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-metadata" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.944537 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-metadata" Jan 30 21:36:48 crc kubenswrapper[4914]: E0130 21:36:48.944610 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-log" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.944632 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" 
containerName="nova-metadata-log" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.944980 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-log" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.945017 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" containerName="nova-metadata-metadata" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.946978 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.950276 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.951016 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:36:48 crc kubenswrapper[4914]: I0130 21:36:48.959185 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.034825 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.034985 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6009160e-a137-406a-9993-ce86e2236110-logs\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.035061 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-config-data\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.035243 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qhz\" (UniqueName: \"kubernetes.io/projected/6009160e-a137-406a-9993-ce86e2236110-kube-api-access-q9qhz\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.035392 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.137950 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qhz\" (UniqueName: \"kubernetes.io/projected/6009160e-a137-406a-9993-ce86e2236110-kube-api-access-q9qhz\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.138288 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.139129 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.139196 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6009160e-a137-406a-9993-ce86e2236110-logs\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.139224 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-config-data\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.140025 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6009160e-a137-406a-9993-ce86e2236110-logs\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.144549 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.145059 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: 
I0130 21:36:49.145192 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-config-data\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.168454 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qhz\" (UniqueName: \"kubernetes.io/projected/6009160e-a137-406a-9993-ce86e2236110-kube-api-access-q9qhz\") pod \"nova-metadata-0\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.270871 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:36:49 crc kubenswrapper[4914]: W0130 21:36:49.765207 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6009160e_a137_406a_9993_ce86e2236110.slice/crio-2f8f319e2c738bdeeead6bf9429cef1d0d392a9f68b34cd51da48e80b2e877ee WatchSource:0}: Error finding container 2f8f319e2c738bdeeead6bf9429cef1d0d392a9f68b34cd51da48e80b2e877ee: Status 404 returned error can't find the container with id 2f8f319e2c738bdeeead6bf9429cef1d0d392a9f68b34cd51da48e80b2e877ee Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.773846 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.835050 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3693b53a-8476-4717-bb56-d5d54f287bb7" path="/var/lib/kubelet/pods/3693b53a-8476-4717-bb56-d5d54f287bb7/volumes" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.860538 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"95ce4f8e-21fa-41f4-a300-a1cff7594ce3","Type":"ContainerStarted","Data":"9b97ca10c7b2d36cc9bb2016a7147ee3de7df1a853ca626a71afa0542d388b82"} Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.860618 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.865161 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6009160e-a137-406a-9993-ce86e2236110","Type":"ContainerStarted","Data":"2f8f319e2c738bdeeead6bf9429cef1d0d392a9f68b34cd51da48e80b2e877ee"} Jan 30 21:36:49 crc kubenswrapper[4914]: I0130 21:36:49.881851 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.881836036 podStartE2EDuration="2.881836036s" podCreationTimestamp="2026-01-30 21:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:49.877749898 +0000 UTC m=+1343.316386659" watchObservedRunningTime="2026-01-30 21:36:49.881836036 +0000 UTC m=+1343.320472787" Jan 30 21:36:50 crc kubenswrapper[4914]: E0130 21:36:50.060695 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 21:36:50 crc kubenswrapper[4914]: E0130 21:36:50.063929 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 21:36:50 crc 
kubenswrapper[4914]: E0130 21:36:50.065205 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 21:36:50 crc kubenswrapper[4914]: E0130 21:36:50.065363 4914 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" containerName="nova-scheduler-scheduler" Jan 30 21:36:50 crc kubenswrapper[4914]: I0130 21:36:50.876822 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6009160e-a137-406a-9993-ce86e2236110","Type":"ContainerStarted","Data":"8d450d88b86419d8b0697a01ac8e68ed82276acc2ce676d606b984494f77c96a"} Jan 30 21:36:50 crc kubenswrapper[4914]: I0130 21:36:50.877188 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6009160e-a137-406a-9993-ce86e2236110","Type":"ContainerStarted","Data":"f1fb98fe4f62ea81fa973b4c031a0dd2f70fe282ea0ccd01772ab33e5481b60d"} Jan 30 21:36:50 crc kubenswrapper[4914]: I0130 21:36:50.897420 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.897394492 podStartE2EDuration="2.897394492s" podCreationTimestamp="2026-01-30 21:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:50.892874813 +0000 UTC m=+1344.331511594" watchObservedRunningTime="2026-01-30 21:36:50.897394492 +0000 UTC m=+1344.336031273" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.796851 4914 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.892353 4914 generic.go:334] "Generic (PLEG): container finished" podID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerID="fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99" exitCode=0 Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.892919 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f6c534e-f1a8-412b-9a62-df7cb562d938","Type":"ContainerDied","Data":"fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99"} Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.892964 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f6c534e-f1a8-412b-9a62-df7cb562d938","Type":"ContainerDied","Data":"d0e92b7a3d65142932174a5cd617354c7fc288838c0cce8c809500daddd1a18f"} Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.892983 4914 scope.go:117] "RemoveContainer" containerID="fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.892926 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.909012 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c534e-f1a8-412b-9a62-df7cb562d938-logs\") pod \"3f6c534e-f1a8-412b-9a62-df7cb562d938\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.909271 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-combined-ca-bundle\") pod \"3f6c534e-f1a8-412b-9a62-df7cb562d938\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.909364 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-config-data\") pod \"3f6c534e-f1a8-412b-9a62-df7cb562d938\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.909417 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bht6z\" (UniqueName: \"kubernetes.io/projected/3f6c534e-f1a8-412b-9a62-df7cb562d938-kube-api-access-bht6z\") pod \"3f6c534e-f1a8-412b-9a62-df7cb562d938\" (UID: \"3f6c534e-f1a8-412b-9a62-df7cb562d938\") " Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.910851 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6c534e-f1a8-412b-9a62-df7cb562d938-logs" (OuterVolumeSpecName: "logs") pod "3f6c534e-f1a8-412b-9a62-df7cb562d938" (UID: "3f6c534e-f1a8-412b-9a62-df7cb562d938"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.914743 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6c534e-f1a8-412b-9a62-df7cb562d938-kube-api-access-bht6z" (OuterVolumeSpecName: "kube-api-access-bht6z") pod "3f6c534e-f1a8-412b-9a62-df7cb562d938" (UID: "3f6c534e-f1a8-412b-9a62-df7cb562d938"). InnerVolumeSpecName "kube-api-access-bht6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.918983 4914 scope.go:117] "RemoveContainer" containerID="cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.937597 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f6c534e-f1a8-412b-9a62-df7cb562d938" (UID: "3f6c534e-f1a8-412b-9a62-df7cb562d938"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:51 crc kubenswrapper[4914]: I0130 21:36:51.941326 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-config-data" (OuterVolumeSpecName: "config-data") pod "3f6c534e-f1a8-412b-9a62-df7cb562d938" (UID: "3f6c534e-f1a8-412b-9a62-df7cb562d938"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.013578 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.013622 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c534e-f1a8-412b-9a62-df7cb562d938-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.013641 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bht6z\" (UniqueName: \"kubernetes.io/projected/3f6c534e-f1a8-412b-9a62-df7cb562d938-kube-api-access-bht6z\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.013662 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c534e-f1a8-412b-9a62-df7cb562d938-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.027569 4914 scope.go:117] "RemoveContainer" containerID="fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99" Jan 30 21:36:52 crc kubenswrapper[4914]: E0130 21:36:52.028079 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99\": container with ID starting with fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99 not found: ID does not exist" containerID="fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.028144 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99"} 
err="failed to get container status \"fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99\": rpc error: code = NotFound desc = could not find container \"fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99\": container with ID starting with fe9c885745ef7fe51540c5ffb342ac8e577830574ad4cda231240be188218a99 not found: ID does not exist" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.028194 4914 scope.go:117] "RemoveContainer" containerID="cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64" Jan 30 21:36:52 crc kubenswrapper[4914]: E0130 21:36:52.028526 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64\": container with ID starting with cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64 not found: ID does not exist" containerID="cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.028567 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64"} err="failed to get container status \"cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64\": rpc error: code = NotFound desc = could not find container \"cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64\": container with ID starting with cea73fc16d5fcfb062e70d44f002e4f032bc6810ec3a727cbe9b9676346ceb64 not found: ID does not exist" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.243639 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.255399 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.267023 4914 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:52 crc kubenswrapper[4914]: E0130 21:36:52.267563 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-log" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.267583 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-log" Jan 30 21:36:52 crc kubenswrapper[4914]: E0130 21:36:52.267605 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-api" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.267614 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-api" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.267882 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-log" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.267918 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" containerName="nova-api-api" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.269535 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.274653 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.277916 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.420552 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.421014 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbj6w\" (UniqueName: \"kubernetes.io/projected/bb8480b9-d379-498c-afd7-298048f61525-kube-api-access-cbj6w\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.421186 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8480b9-d379-498c-afd7-298048f61525-logs\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.421248 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-config-data\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.522932 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.522977 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbj6w\" (UniqueName: \"kubernetes.io/projected/bb8480b9-d379-498c-afd7-298048f61525-kube-api-access-cbj6w\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.523075 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8480b9-d379-498c-afd7-298048f61525-logs\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.523111 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-config-data\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.523874 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8480b9-d379-498c-afd7-298048f61525-logs\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.526610 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-config-data\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.527370 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.543612 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbj6w\" (UniqueName: \"kubernetes.io/projected/bb8480b9-d379-498c-afd7-298048f61525-kube-api-access-cbj6w\") pod \"nova-api-0\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.593941 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.696796 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.828372 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-config-data\") pod \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.828859 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26l24\" (UniqueName: \"kubernetes.io/projected/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-kube-api-access-26l24\") pod \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") " Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.828991 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-combined-ca-bundle\") pod 
\"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\" (UID: \"726bfdd6-4d21-4d07-921b-ef9f28ff96c8\") "
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.833616 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-kube-api-access-26l24" (OuterVolumeSpecName: "kube-api-access-26l24") pod "726bfdd6-4d21-4d07-921b-ef9f28ff96c8" (UID: "726bfdd6-4d21-4d07-921b-ef9f28ff96c8"). InnerVolumeSpecName "kube-api-access-26l24". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.863282 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "726bfdd6-4d21-4d07-921b-ef9f28ff96c8" (UID: "726bfdd6-4d21-4d07-921b-ef9f28ff96c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.863393 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-config-data" (OuterVolumeSpecName: "config-data") pod "726bfdd6-4d21-4d07-921b-ef9f28ff96c8" (UID: "726bfdd6-4d21-4d07-921b-ef9f28ff96c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.905091 4914 generic.go:334] "Generic (PLEG): container finished" podID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14" exitCode=0
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.905155 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"726bfdd6-4d21-4d07-921b-ef9f28ff96c8","Type":"ContainerDied","Data":"1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14"}
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.905184 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"726bfdd6-4d21-4d07-921b-ef9f28ff96c8","Type":"ContainerDied","Data":"c23323013ecd18323321984eb91eaf18cf0e1a9c296675558182fd52116db3ab"}
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.905204 4914 scope.go:117] "RemoveContainer" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.905313 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.931216 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26l24\" (UniqueName: \"kubernetes.io/projected/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-kube-api-access-26l24\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.931259 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.931273 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726bfdd6-4d21-4d07-921b-ef9f28ff96c8-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.931383 4914 scope.go:117] "RemoveContainer" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14"
Jan 30 21:36:52 crc kubenswrapper[4914]: E0130 21:36:52.931903 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14\": container with ID starting with 1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14 not found: ID does not exist" containerID="1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.931932 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14"} err="failed to get container status \"1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14\": rpc error: code = NotFound desc = could not find container \"1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14\": container with ID starting with 1dd3456653dbc646ba4680e3c4a050ef7892f6c2505fb4ccb1d00ba28028eb14 not found: ID does not exist"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.937619 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.953132 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.966030 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:36:52 crc kubenswrapper[4914]: E0130 21:36:52.966439 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" containerName="nova-scheduler-scheduler"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.966457 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" containerName="nova-scheduler-scheduler"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.966644 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" containerName="nova-scheduler-scheduler"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.967366 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.969451 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 21:36:52 crc kubenswrapper[4914]: I0130 21:36:52.981561 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.033513 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-config-data\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.033632 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.033662 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5fkj\" (UniqueName: \"kubernetes.io/projected/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-kube-api-access-l5fkj\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.065136 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 30 21:36:53 crc kubenswrapper[4914]: W0130 21:36:53.066739 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8480b9_d379_498c_afd7_298048f61525.slice/crio-415b264e75bab2591d601af788ebab88bb7f55048a82bd19b4544e37cd4e6b7b WatchSource:0}: Error finding container 415b264e75bab2591d601af788ebab88bb7f55048a82bd19b4544e37cd4e6b7b: Status 404 returned error can't find the container with id 415b264e75bab2591d601af788ebab88bb7f55048a82bd19b4544e37cd4e6b7b
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.135130 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.135180 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5fkj\" (UniqueName: \"kubernetes.io/projected/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-kube-api-access-l5fkj\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.135267 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-config-data\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.138510 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-config-data\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.139640 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.152381 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5fkj\" (UniqueName: \"kubernetes.io/projected/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-kube-api-access-l5fkj\") pod \"nova-scheduler-0\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.287422 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 21:36:53 crc kubenswrapper[4914]: W0130 21:36:53.737541 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf18dbe_0b07_4ae2_8398_c29fe48daaff.slice/crio-0cd023662c5b3a5f143c1787ebae0636c02174c80f655809b7e398c9a658295d WatchSource:0}: Error finding container 0cd023662c5b3a5f143c1787ebae0636c02174c80f655809b7e398c9a658295d: Status 404 returned error can't find the container with id 0cd023662c5b3a5f143c1787ebae0636c02174c80f655809b7e398c9a658295d
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.746831 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.833626 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6c534e-f1a8-412b-9a62-df7cb562d938" path="/var/lib/kubelet/pods/3f6c534e-f1a8-412b-9a62-df7cb562d938/volumes"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.834209 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726bfdd6-4d21-4d07-921b-ef9f28ff96c8" path="/var/lib/kubelet/pods/726bfdd6-4d21-4d07-921b-ef9f28ff96c8/volumes"
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.917972 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcf18dbe-0b07-4ae2-8398-c29fe48daaff","Type":"ContainerStarted","Data":"0cd023662c5b3a5f143c1787ebae0636c02174c80f655809b7e398c9a658295d"}
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.921224 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb8480b9-d379-498c-afd7-298048f61525","Type":"ContainerStarted","Data":"ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0"}
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.921269 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb8480b9-d379-498c-afd7-298048f61525","Type":"ContainerStarted","Data":"6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b"}
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.921284 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb8480b9-d379-498c-afd7-298048f61525","Type":"ContainerStarted","Data":"415b264e75bab2591d601af788ebab88bb7f55048a82bd19b4544e37cd4e6b7b"}
Jan 30 21:36:53 crc kubenswrapper[4914]: I0130 21:36:53.940384 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.940362612 podStartE2EDuration="1.940362612s" podCreationTimestamp="2026-01-30 21:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:53.93525556 +0000 UTC m=+1347.373892321" watchObservedRunningTime="2026-01-30 21:36:53.940362612 +0000 UTC m=+1347.378999373"
Jan 30 21:36:54 crc kubenswrapper[4914]: I0130 21:36:54.271805 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 21:36:54 crc kubenswrapper[4914]: I0130 21:36:54.271878 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 30 21:36:54 crc kubenswrapper[4914]: I0130 21:36:54.933416 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcf18dbe-0b07-4ae2-8398-c29fe48daaff","Type":"ContainerStarted","Data":"fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523"}
Jan 30 21:36:54 crc kubenswrapper[4914]: I0130 21:36:54.957996 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.957974787 podStartE2EDuration="2.957974787s" podCreationTimestamp="2026-01-30 21:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:54.949179726 +0000 UTC m=+1348.387816487" watchObservedRunningTime="2026-01-30 21:36:54.957974787 +0000 UTC m=+1348.396611568"
Jan 30 21:36:54 crc kubenswrapper[4914]: I0130 21:36:54.980311 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 21:36:58 crc kubenswrapper[4914]: I0130 21:36:58.273765 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Jan 30 21:36:58 crc kubenswrapper[4914]: I0130 21:36:58.287850 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 30 21:36:59 crc kubenswrapper[4914]: I0130 21:36:59.271578 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 21:36:59 crc kubenswrapper[4914]: I0130 21:36:59.271654 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 30 21:37:00 crc kubenswrapper[4914]: I0130 21:37:00.283839 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:37:00 crc kubenswrapper[4914]: I0130 21:37:00.283888 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:37:02 crc kubenswrapper[4914]: I0130 21:37:02.595148 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 21:37:02 crc kubenswrapper[4914]: I0130 21:37:02.595519 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 30 21:37:03 crc kubenswrapper[4914]: I0130 21:37:03.288550 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 30 21:37:03 crc kubenswrapper[4914]: I0130 21:37:03.324113 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 30 21:37:03 crc kubenswrapper[4914]: I0130 21:37:03.678251 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:37:03 crc kubenswrapper[4914]: I0130 21:37:03.678925 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:37:04 crc kubenswrapper[4914]: I0130 21:37:04.069095 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 30 21:37:09 crc kubenswrapper[4914]: I0130 21:37:09.278088 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 30 21:37:09 crc kubenswrapper[4914]: I0130 21:37:09.278771 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 30 21:37:09 crc kubenswrapper[4914]: I0130 21:37:09.287386 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 30 21:37:09 crc kubenswrapper[4914]: I0130 21:37:09.289397 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.118506 4914 generic.go:334] "Generic (PLEG): container finished" podID="ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" containerID="e71fc1513988061b645764815fbe699a0f4c556ff0363418c53f989c84361f40" exitCode=137
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.118543 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0","Type":"ContainerDied","Data":"e71fc1513988061b645764815fbe699a0f4c556ff0363418c53f989c84361f40"}
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.282449 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.365005 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-config-data\") pod \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") "
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.365259 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-combined-ca-bundle\") pod \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") "
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.365408 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9tm4\" (UniqueName: \"kubernetes.io/projected/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-kube-api-access-g9tm4\") pod \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\" (UID: \"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0\") "
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.370313 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-kube-api-access-g9tm4" (OuterVolumeSpecName: "kube-api-access-g9tm4") pod "ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" (UID: "ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0"). InnerVolumeSpecName "kube-api-access-g9tm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.396119 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-config-data" (OuterVolumeSpecName: "config-data") pod "ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" (UID: "ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.396626 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" (UID: "ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.468551 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.468602 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9tm4\" (UniqueName: \"kubernetes.io/projected/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-kube-api-access-g9tm4\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.468617 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.598243 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.599535 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.599853 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 30 21:37:12 crc kubenswrapper[4914]: I0130 21:37:12.603031 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.132562 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0","Type":"ContainerDied","Data":"48dc8c08272759bfe3e4c83118962687e0fe237cd51baddf66eb8a0eaa8b1a82"}
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.132857 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.133009 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.133028 4914 scope.go:117] "RemoveContainer" containerID="e71fc1513988061b645764815fbe699a0f4c556ff0363418c53f989c84361f40"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.141420 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.210021 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.224349 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.253123 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 21:37:13 crc kubenswrapper[4914]: E0130 21:37:13.253655 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" containerName="nova-cell1-novncproxy-novncproxy"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.253677 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" containerName="nova-cell1-novncproxy-novncproxy"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.253925 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" containerName="nova-cell1-novncproxy-novncproxy"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.255331 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.262324 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.262507 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.262652 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.295662 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.376591 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-5c58f"]
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.378672 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.387791 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.387835 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.387886 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2ln\" (UniqueName: \"kubernetes.io/projected/60990201-067f-48d5-8a61-1a117609cfc7-kube-api-access-vr2ln\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.387988 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.388068 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.431381 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-5c58f"]
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.489900 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.489970 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490011 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghq2q\" (UniqueName: \"kubernetes.io/projected/b4b59e65-251e-4633-8961-af93c0b108ce-kube-api-access-ghq2q\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490048 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490079 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-config\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490122 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490151 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490177 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490192 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490216 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.490247 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2ln\" (UniqueName: \"kubernetes.io/projected/60990201-067f-48d5-8a61-1a117609cfc7-kube-api-access-vr2ln\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.495612 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.495965 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.496224 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.509106 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2ln\" (UniqueName: \"kubernetes.io/projected/60990201-067f-48d5-8a61-1a117609cfc7-kube-api-access-vr2ln\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.523103 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60990201-067f-48d5-8a61-1a117609cfc7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"60990201-067f-48d5-8a61-1a117609cfc7\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.592623 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghq2q\" (UniqueName: \"kubernetes.io/projected/b4b59e65-251e-4633-8961-af93c0b108ce-kube-api-access-ghq2q\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.593003 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.593034 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-config\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.593112 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.593200 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.593342 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.594226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-config\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.594389 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.594803 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.595318 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.596120 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.599543 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.612920 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghq2q\" (UniqueName: \"kubernetes.io/projected/b4b59e65-251e-4633-8961-af93c0b108ce-kube-api-access-ghq2q\") pod \"dnsmasq-dns-5fd9b586ff-5c58f\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.709698 4914 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" Jan 30 21:37:13 crc kubenswrapper[4914]: I0130 21:37:13.831797 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0" path="/var/lib/kubelet/pods/ed7d4288-0cdf-4d5a-86dd-2c82b1c6b6d0/volumes" Jan 30 21:37:14 crc kubenswrapper[4914]: W0130 21:37:14.088049 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60990201_067f_48d5_8a61_1a117609cfc7.slice/crio-fd8a65b56347f0c5cd50695426060eb2b2a431702fba9e1e09c18ab270c63c15 WatchSource:0}: Error finding container fd8a65b56347f0c5cd50695426060eb2b2a431702fba9e1e09c18ab270c63c15: Status 404 returned error can't find the container with id fd8a65b56347f0c5cd50695426060eb2b2a431702fba9e1e09c18ab270c63c15 Jan 30 21:37:14 crc kubenswrapper[4914]: I0130 21:37:14.090796 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:37:14 crc kubenswrapper[4914]: I0130 21:37:14.145422 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"60990201-067f-48d5-8a61-1a117609cfc7","Type":"ContainerStarted","Data":"fd8a65b56347f0c5cd50695426060eb2b2a431702fba9e1e09c18ab270c63c15"} Jan 30 21:37:14 crc kubenswrapper[4914]: I0130 21:37:14.324612 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-5c58f"] Jan 30 21:37:14 crc kubenswrapper[4914]: W0130 21:37:14.330942 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4b59e65_251e_4633_8961_af93c0b108ce.slice/crio-1706a513dfd35bac292c5642bc7fffbc18d602dc4687a951766358d23dc143c6 WatchSource:0}: Error finding container 1706a513dfd35bac292c5642bc7fffbc18d602dc4687a951766358d23dc143c6: Status 404 returned error can't find the container with id 
1706a513dfd35bac292c5642bc7fffbc18d602dc4687a951766358d23dc143c6 Jan 30 21:37:15 crc kubenswrapper[4914]: I0130 21:37:15.178696 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"60990201-067f-48d5-8a61-1a117609cfc7","Type":"ContainerStarted","Data":"25f874aab4cbc4d53b1c3364ce0041d0a99a2675a648f03365d7fe769edce2bd"} Jan 30 21:37:15 crc kubenswrapper[4914]: I0130 21:37:15.192221 4914 generic.go:334] "Generic (PLEG): container finished" podID="b4b59e65-251e-4633-8961-af93c0b108ce" containerID="0e309fb1c61c59b3d28900dd5b24929b5f96e0bdce1a951b9d3c8868383c76ba" exitCode=0 Jan 30 21:37:15 crc kubenswrapper[4914]: I0130 21:37:15.192272 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" event={"ID":"b4b59e65-251e-4633-8961-af93c0b108ce","Type":"ContainerDied","Data":"0e309fb1c61c59b3d28900dd5b24929b5f96e0bdce1a951b9d3c8868383c76ba"} Jan 30 21:37:15 crc kubenswrapper[4914]: I0130 21:37:15.192310 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" event={"ID":"b4b59e65-251e-4633-8961-af93c0b108ce","Type":"ContainerStarted","Data":"1706a513dfd35bac292c5642bc7fffbc18d602dc4687a951766358d23dc143c6"} Jan 30 21:37:15 crc kubenswrapper[4914]: I0130 21:37:15.242949 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.242931913 podStartE2EDuration="2.242931913s" podCreationTimestamp="2026-01-30 21:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:15.197160844 +0000 UTC m=+1368.635797605" watchObservedRunningTime="2026-01-30 21:37:15.242931913 +0000 UTC m=+1368.681568674" Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.040865 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:16 crc 
kubenswrapper[4914]: I0130 21:37:16.041512 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-central-agent" containerID="cri-o://6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31" gracePeriod=30 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.041834 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="sg-core" containerID="cri-o://1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e" gracePeriod=30 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.041896 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-notification-agent" containerID="cri-o://e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531" gracePeriod=30 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.041519 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="proxy-httpd" containerID="cri-o://78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59" gracePeriod=30 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.191269 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.203958 4914 generic.go:334] "Generic (PLEG): container finished" podID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerID="1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e" exitCode=2 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.204158 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerDied","Data":"1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e"} Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.209360 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-log" containerID="cri-o://6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b" gracePeriod=30 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.209865 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" event={"ID":"b4b59e65-251e-4633-8961-af93c0b108ce","Type":"ContainerStarted","Data":"7d55f367f6c5b31b760a00857883921d36ad4779e2db6e4ff3c4418689116835"} Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.210970 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-api" containerID="cri-o://ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0" gracePeriod=30 Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.211602 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" Jan 30 21:37:16 crc kubenswrapper[4914]: I0130 21:37:16.246276 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" podStartSLOduration=3.24625874 podStartE2EDuration="3.24625874s" podCreationTimestamp="2026-01-30 21:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:16.244763893 +0000 UTC m=+1369.683400654" watchObservedRunningTime="2026-01-30 21:37:16.24625874 +0000 UTC m=+1369.684895501" Jan 30 21:37:17 crc kubenswrapper[4914]: I0130 21:37:17.232445 4914 generic.go:334] "Generic (PLEG): 
container finished" podID="bb8480b9-d379-498c-afd7-298048f61525" containerID="6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b" exitCode=143 Jan 30 21:37:17 crc kubenswrapper[4914]: I0130 21:37:17.232905 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb8480b9-d379-498c-afd7-298048f61525","Type":"ContainerDied","Data":"6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b"} Jan 30 21:37:17 crc kubenswrapper[4914]: I0130 21:37:17.238107 4914 generic.go:334] "Generic (PLEG): container finished" podID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerID="78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59" exitCode=0 Jan 30 21:37:17 crc kubenswrapper[4914]: I0130 21:37:17.238164 4914 generic.go:334] "Generic (PLEG): container finished" podID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerID="6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31" exitCode=0 Jan 30 21:37:17 crc kubenswrapper[4914]: I0130 21:37:17.238255 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerDied","Data":"78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59"} Jan 30 21:37:17 crc kubenswrapper[4914]: I0130 21:37:17.238313 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerDied","Data":"6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31"} Jan 30 21:37:18 crc kubenswrapper[4914]: I0130 21:37:18.600203 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.888200 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.952969 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-config-data\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953016 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-combined-ca-bundle\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953063 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-ceilometer-tls-certs\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953104 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-sg-core-conf-yaml\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953163 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-scripts\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953250 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgd8p\" (UniqueName: 
\"kubernetes.io/projected/9616e365-c35d-4adb-96e5-81c1e34c7068-kube-api-access-bgd8p\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953315 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-log-httpd\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.953410 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-run-httpd\") pod \"9616e365-c35d-4adb-96e5-81c1e34c7068\" (UID: \"9616e365-c35d-4adb-96e5-81c1e34c7068\") " Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.957949 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.958175 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.968187 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-scripts" (OuterVolumeSpecName: "scripts") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.975221 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9616e365-c35d-4adb-96e5-81c1e34c7068-kube-api-access-bgd8p" (OuterVolumeSpecName: "kube-api-access-bgd8p") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "kube-api-access-bgd8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:19 crc kubenswrapper[4914]: I0130 21:37:19.988389 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.034745 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055059 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-config-data\") pod \"bb8480b9-d379-498c-afd7-298048f61525\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055248 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-combined-ca-bundle\") pod \"bb8480b9-d379-498c-afd7-298048f61525\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055351 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbj6w\" (UniqueName: \"kubernetes.io/projected/bb8480b9-d379-498c-afd7-298048f61525-kube-api-access-cbj6w\") pod \"bb8480b9-d379-498c-afd7-298048f61525\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055398 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8480b9-d379-498c-afd7-298048f61525-logs\") pod \"bb8480b9-d379-498c-afd7-298048f61525\" (UID: \"bb8480b9-d379-498c-afd7-298048f61525\") " Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055887 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055908 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 
21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055919 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055928 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgd8p\" (UniqueName: \"kubernetes.io/projected/9616e365-c35d-4adb-96e5-81c1e34c7068-kube-api-access-bgd8p\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.055937 4914 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9616e365-c35d-4adb-96e5-81c1e34c7068-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.056289 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8480b9-d379-498c-afd7-298048f61525-logs" (OuterVolumeSpecName: "logs") pod "bb8480b9-d379-498c-afd7-298048f61525" (UID: "bb8480b9-d379-498c-afd7-298048f61525"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.069449 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8480b9-d379-498c-afd7-298048f61525-kube-api-access-cbj6w" (OuterVolumeSpecName: "kube-api-access-cbj6w") pod "bb8480b9-d379-498c-afd7-298048f61525" (UID: "bb8480b9-d379-498c-afd7-298048f61525"). InnerVolumeSpecName "kube-api-access-cbj6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.072493 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.082111 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.095759 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-config-data" (OuterVolumeSpecName: "config-data") pod "bb8480b9-d379-498c-afd7-298048f61525" (UID: "bb8480b9-d379-498c-afd7-298048f61525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.111669 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-config-data" (OuterVolumeSpecName: "config-data") pod "9616e365-c35d-4adb-96e5-81c1e34c7068" (UID: "9616e365-c35d-4adb-96e5-81c1e34c7068"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.124898 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8480b9-d379-498c-afd7-298048f61525" (UID: "bb8480b9-d379-498c-afd7-298048f61525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158045 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158074 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbj6w\" (UniqueName: \"kubernetes.io/projected/bb8480b9-d379-498c-afd7-298048f61525-kube-api-access-cbj6w\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158086 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8480b9-d379-498c-afd7-298048f61525-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158095 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158103 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158111 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb8480b9-d379-498c-afd7-298048f61525-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.158119 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9616e365-c35d-4adb-96e5-81c1e34c7068-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.283202 4914 generic.go:334] "Generic (PLEG): container finished" podID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerID="e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531" exitCode=0 Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.283235 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerDied","Data":"e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531"} Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.283281 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.283339 4914 scope.go:117] "RemoveContainer" containerID="78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.283687 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9616e365-c35d-4adb-96e5-81c1e34c7068","Type":"ContainerDied","Data":"572158e158ce20e5570b482ce8586130f37943ff6bebe1243dbbd358d75405d7"} Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.285319 4914 generic.go:334] "Generic (PLEG): container finished" podID="bb8480b9-d379-498c-afd7-298048f61525" containerID="ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0" exitCode=0 Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.285368 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb8480b9-d379-498c-afd7-298048f61525","Type":"ContainerDied","Data":"ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0"} Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.285404 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb8480b9-d379-498c-afd7-298048f61525","Type":"ContainerDied","Data":"415b264e75bab2591d601af788ebab88bb7f55048a82bd19b4544e37cd4e6b7b"} Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.285484 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.356658 4914 scope.go:117] "RemoveContainer" containerID="1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.381539 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.395586 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.407453 4914 scope.go:117] "RemoveContainer" containerID="e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.415694 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.434991 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.438460 4914 scope.go:117] "RemoveContainer" containerID="6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.445544 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.446311 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-api" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446326 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-api" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.446337 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-notification-agent" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446345 4914 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-notification-agent" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.446357 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-central-agent" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446362 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-central-agent" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.446374 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-log" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446379 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-log" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.446395 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="sg-core" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446401 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="sg-core" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.446419 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="proxy-httpd" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446424 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="proxy-httpd" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446596 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="sg-core" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446612 4914 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="proxy-httpd" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446623 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-notification-agent" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446631 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-api" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446639 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" containerName="ceilometer-central-agent" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.446655 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8480b9-d379-498c-afd7-298048f61525" containerName="nova-api-log" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.447659 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.449982 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.450041 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.450286 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.473936 4914 scope.go:117] "RemoveContainer" containerID="78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.474379 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59\": container with ID starting with 78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59 not found: ID does not exist" containerID="78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.474407 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59"} err="failed to get container status \"78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59\": rpc error: code = NotFound desc = could not find container \"78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59\": container with ID starting with 78f82eb10120432a9f7faa25694bd8c695ddcb39405158020a62c34657ca7a59 not found: ID does not exist" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.474426 4914 scope.go:117] "RemoveContainer" containerID="1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e" Jan 30 21:37:20 crc 
kubenswrapper[4914]: E0130 21:37:20.474605 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e\": container with ID starting with 1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e not found: ID does not exist" containerID="1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.474623 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e"} err="failed to get container status \"1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e\": rpc error: code = NotFound desc = could not find container \"1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e\": container with ID starting with 1082f0a1f0010cc8e2cddfff1d876a7c3d03e0fad6ec742bb9926dc201abce9e not found: ID does not exist" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.474636 4914 scope.go:117] "RemoveContainer" containerID="e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.474812 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531\": container with ID starting with e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531 not found: ID does not exist" containerID="e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.474828 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531"} err="failed to get container status 
\"e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531\": rpc error: code = NotFound desc = could not find container \"e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531\": container with ID starting with e2fe73ee6674bb7593373e1f13d591c36ac34a0957725a6b00c30c1d228f2531 not found: ID does not exist" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.474840 4914 scope.go:117] "RemoveContainer" containerID="6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.475008 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31\": container with ID starting with 6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31 not found: ID does not exist" containerID="6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.475026 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31"} err="failed to get container status \"6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31\": rpc error: code = NotFound desc = could not find container \"6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31\": container with ID starting with 6ee41713bca317d7950e050803df4f6804517538bfdb6dd6e6b456e5eb980a31 not found: ID does not exist" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.475036 4914 scope.go:117] "RemoveContainer" containerID="ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.475828 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.501920 4914 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.507875 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.511782 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.511799 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.514527 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.516218 4914 scope.go:117] "RemoveContainer" containerID="6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.582412 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583060 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpqdx\" (UniqueName: \"kubernetes.io/projected/7c7c7644-0241-4145-b369-89153413fd39-kube-api-access-vpqdx\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583144 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583170 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-config-data\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583200 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583238 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-config-data\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583288 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7c7644-0241-4145-b369-89153413fd39-logs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583388 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-run-httpd\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583563 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583799 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvgcr\" (UniqueName: \"kubernetes.io/projected/b0925713-10df-46ab-b311-1a286b6c4515-kube-api-access-fvgcr\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583880 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.583959 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.584018 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-scripts\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.584144 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-log-httpd\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.611086 4914 scope.go:117] "RemoveContainer" containerID="ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.624291 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0\": container with ID starting with ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0 not found: ID does not exist" containerID="ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.624364 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0"} err="failed to get container status \"ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0\": rpc error: code = NotFound desc = could not find container \"ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0\": container with ID starting with ac0d9141779769d1dac8d0bfc66028e97caca8621bc0c21765bad399ead8eea0 not found: ID does not exist" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.624410 4914 scope.go:117] "RemoveContainer" containerID="6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b" Jan 30 21:37:20 crc kubenswrapper[4914]: E0130 21:37:20.636001 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b\": container with ID starting with 6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b not found: ID does not exist" 
containerID="6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.636075 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b"} err="failed to get container status \"6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b\": rpc error: code = NotFound desc = could not find container \"6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b\": container with ID starting with 6fe950d6a876a8f391802b5d4708febfa4e17565eada3cc1e166ae0ecdf0042b not found: ID does not exist" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.642785 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685465 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685514 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpqdx\" (UniqueName: \"kubernetes.io/projected/7c7c7644-0241-4145-b369-89153413fd39-kube-api-access-vpqdx\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685545 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-public-tls-certs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685562 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-config-data\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685578 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685598 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-config-data\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685623 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7c7644-0241-4145-b369-89153413fd39-logs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685660 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-run-httpd\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685727 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 
30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685748 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvgcr\" (UniqueName: \"kubernetes.io/projected/b0925713-10df-46ab-b311-1a286b6c4515-kube-api-access-fvgcr\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685778 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685809 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685831 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-scripts\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.685855 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-log-httpd\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.686246 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-log-httpd\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.688515 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7c7644-0241-4145-b369-89153413fd39-logs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.689002 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-run-httpd\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.700542 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-public-tls-certs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.701057 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.701304 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.709443 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-scripts\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.717372 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.732408 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.732748 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-config-data\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.739486 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpqdx\" (UniqueName: \"kubernetes.io/projected/7c7c7644-0241-4145-b369-89153413fd39-kube-api-access-vpqdx\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.739510 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-config-data\") pod \"nova-api-0\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " pod="openstack/nova-api-0" Jan 30 21:37:20 crc 
kubenswrapper[4914]: I0130 21:37:20.740097 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.749443 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvgcr\" (UniqueName: \"kubernetes.io/projected/b0925713-10df-46ab-b311-1a286b6c4515-kube-api-access-fvgcr\") pod \"ceilometer-0\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " pod="openstack/ceilometer-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.774827 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:20 crc kubenswrapper[4914]: I0130 21:37:20.857140 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:21 crc kubenswrapper[4914]: I0130 21:37:21.243465 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:21 crc kubenswrapper[4914]: I0130 21:37:21.315368 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7c7c7644-0241-4145-b369-89153413fd39","Type":"ContainerStarted","Data":"7ecf8538bc7ca57f118ed5b51d1623b10685517eb4f1c08872252f5cc069da32"} Jan 30 21:37:21 crc kubenswrapper[4914]: I0130 21:37:21.412844 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:21 crc kubenswrapper[4914]: I0130 21:37:21.837890 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9616e365-c35d-4adb-96e5-81c1e34c7068" path="/var/lib/kubelet/pods/9616e365-c35d-4adb-96e5-81c1e34c7068/volumes" Jan 30 21:37:21 crc kubenswrapper[4914]: I0130 21:37:21.838644 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bb8480b9-d379-498c-afd7-298048f61525" path="/var/lib/kubelet/pods/bb8480b9-d379-498c-afd7-298048f61525/volumes" Jan 30 21:37:22 crc kubenswrapper[4914]: I0130 21:37:22.332959 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7c7c7644-0241-4145-b369-89153413fd39","Type":"ContainerStarted","Data":"34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442"} Jan 30 21:37:22 crc kubenswrapper[4914]: I0130 21:37:22.333322 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7c7c7644-0241-4145-b369-89153413fd39","Type":"ContainerStarted","Data":"9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa"} Jan 30 21:37:22 crc kubenswrapper[4914]: I0130 21:37:22.334198 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerStarted","Data":"3256250860d0a832432bfe7e47c29f9ca08c6d2f59cbb94c6f5922bddef2c3b7"} Jan 30 21:37:22 crc kubenswrapper[4914]: I0130 21:37:22.367126 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.366992906 podStartE2EDuration="2.366992906s" podCreationTimestamp="2026-01-30 21:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:22.35449863 +0000 UTC m=+1375.793135391" watchObservedRunningTime="2026-01-30 21:37:22.366992906 +0000 UTC m=+1375.805629667" Jan 30 21:37:23 crc kubenswrapper[4914]: I0130 21:37:23.344351 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerStarted","Data":"c137c0d9d4c34a980d068a6d9afe97ff7d2ff7fb6e8368d990c2a0b69ae4356a"} Jan 30 21:37:23 crc kubenswrapper[4914]: I0130 21:37:23.600755 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:37:23 crc kubenswrapper[4914]: I0130 21:37:23.629849 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:37:23 crc kubenswrapper[4914]: I0130 21:37:23.711608 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" Jan 30 21:37:23 crc kubenswrapper[4914]: I0130 21:37:23.789181 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-jkq7z"] Jan 30 21:37:23 crc kubenswrapper[4914]: I0130 21:37:23.789444 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerName="dnsmasq-dns" containerID="cri-o://83bb224c73761434b974004f5d068b75a0e7edaf414ade541058e1feb0c0ee53" gracePeriod=10 Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.379598 4914 generic.go:334] "Generic (PLEG): container finished" podID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerID="83bb224c73761434b974004f5d068b75a0e7edaf414ade541058e1feb0c0ee53" exitCode=0 Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.381118 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" event={"ID":"ba416ff3-6b9e-42c4-bf91-582be4df1ed1","Type":"ContainerDied","Data":"83bb224c73761434b974004f5d068b75a0e7edaf414ade541058e1feb0c0ee53"} Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.399978 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.569784 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rtrh8"] Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.571297 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.573968 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.574162 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.580246 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rtrh8"] Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.705541 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-config-data\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.705834 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.705911 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkpt\" (UniqueName: \"kubernetes.io/projected/62190399-de6a-48f4-ba20-2458736b5b37-kube-api-access-cvkpt\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.705969 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-scripts\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.807761 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-scripts\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.807887 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-config-data\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.807924 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.807984 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkpt\" (UniqueName: \"kubernetes.io/projected/62190399-de6a-48f4-ba20-2458736b5b37-kube-api-access-cvkpt\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.816487 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.816494 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-scripts\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.816536 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-config-data\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.830171 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkpt\" (UniqueName: \"kubernetes.io/projected/62190399-de6a-48f4-ba20-2458736b5b37-kube-api-access-cvkpt\") pod \"nova-cell1-cell-mapping-rtrh8\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.892245 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:37:24 crc kubenswrapper[4914]: I0130 21:37:24.892260 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.010766 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-config\") pod \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.011088 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-sb\") pod \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.011109 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xgjz\" (UniqueName: \"kubernetes.io/projected/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-kube-api-access-5xgjz\") pod \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.011464 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-swift-storage-0\") pod \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.011545 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-nb\") pod \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.011659 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-svc\") pod \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\" (UID: \"ba416ff3-6b9e-42c4-bf91-582be4df1ed1\") " Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.023066 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-kube-api-access-5xgjz" (OuterVolumeSpecName: "kube-api-access-5xgjz") pod "ba416ff3-6b9e-42c4-bf91-582be4df1ed1" (UID: "ba416ff3-6b9e-42c4-bf91-582be4df1ed1"). InnerVolumeSpecName "kube-api-access-5xgjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.088934 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba416ff3-6b9e-42c4-bf91-582be4df1ed1" (UID: "ba416ff3-6b9e-42c4-bf91-582be4df1ed1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.124739 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xgjz\" (UniqueName: \"kubernetes.io/projected/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-kube-api-access-5xgjz\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.124768 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.142542 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-config" (OuterVolumeSpecName: "config") pod "ba416ff3-6b9e-42c4-bf91-582be4df1ed1" (UID: "ba416ff3-6b9e-42c4-bf91-582be4df1ed1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.154497 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba416ff3-6b9e-42c4-bf91-582be4df1ed1" (UID: "ba416ff3-6b9e-42c4-bf91-582be4df1ed1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.163935 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba416ff3-6b9e-42c4-bf91-582be4df1ed1" (UID: "ba416ff3-6b9e-42c4-bf91-582be4df1ed1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.221318 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba416ff3-6b9e-42c4-bf91-582be4df1ed1" (UID: "ba416ff3-6b9e-42c4-bf91-582be4df1ed1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.226098 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.226136 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.226146 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.226155 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba416ff3-6b9e-42c4-bf91-582be4df1ed1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.391910 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerStarted","Data":"2d18ebea566347b4834ffdf0dd7c1479c533c39dc9ed79e4f0abfc93cf843c54"} Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.395668 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-78cd565959-jkq7z" event={"ID":"ba416ff3-6b9e-42c4-bf91-582be4df1ed1","Type":"ContainerDied","Data":"a388ee1ca194c1117d792ee610a5ec86f595666d6cacf1ed9819f63a5cc86e76"} Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.395698 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-jkq7z" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.395999 4914 scope.go:117] "RemoveContainer" containerID="83bb224c73761434b974004f5d068b75a0e7edaf414ade541058e1feb0c0ee53" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.429659 4914 scope.go:117] "RemoveContainer" containerID="2bc925bd362ee7effeda0ca668fbfbf0d9b1efb50278fac7833e2515aee512d2" Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.439761 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rtrh8"] Jan 30 21:37:25 crc kubenswrapper[4914]: W0130 21:37:25.444299 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62190399_de6a_48f4_ba20_2458736b5b37.slice/crio-3e3743c8a6319905740a1dd6261197a4f0c887a84cbe7263db26d0e2b2c153b0 WatchSource:0}: Error finding container 3e3743c8a6319905740a1dd6261197a4f0c887a84cbe7263db26d0e2b2c153b0: Status 404 returned error can't find the container with id 3e3743c8a6319905740a1dd6261197a4f0c887a84cbe7263db26d0e2b2c153b0 Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.452842 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-jkq7z"] Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.472619 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-jkq7z"] Jan 30 21:37:25 crc kubenswrapper[4914]: I0130 21:37:25.828659 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" 
path="/var/lib/kubelet/pods/ba416ff3-6b9e-42c4-bf91-582be4df1ed1/volumes" Jan 30 21:37:26 crc kubenswrapper[4914]: I0130 21:37:26.408147 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rtrh8" event={"ID":"62190399-de6a-48f4-ba20-2458736b5b37","Type":"ContainerStarted","Data":"3d9580134d260a1fdfdbba2e135073cb383e926e2e20e883e77c8ea6e25dc6c8"} Jan 30 21:37:26 crc kubenswrapper[4914]: I0130 21:37:26.408493 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rtrh8" event={"ID":"62190399-de6a-48f4-ba20-2458736b5b37","Type":"ContainerStarted","Data":"3e3743c8a6319905740a1dd6261197a4f0c887a84cbe7263db26d0e2b2c153b0"} Jan 30 21:37:26 crc kubenswrapper[4914]: I0130 21:37:26.427331 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rtrh8" podStartSLOduration=2.4273104229999998 podStartE2EDuration="2.427310423s" podCreationTimestamp="2026-01-30 21:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:26.419558374 +0000 UTC m=+1379.858195135" watchObservedRunningTime="2026-01-30 21:37:26.427310423 +0000 UTC m=+1379.865947194" Jan 30 21:37:27 crc kubenswrapper[4914]: I0130 21:37:27.423205 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerStarted","Data":"637173705c6758c96e2f74e123347162e6406afd2ca310ef807710a339a2d267"} Jan 30 21:37:29 crc kubenswrapper[4914]: I0130 21:37:29.463344 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerStarted","Data":"a5c40f88a01bdeebb7910835c95d0ff73d2755e653b1ae63eb411c275266e855"} Jan 30 21:37:29 crc kubenswrapper[4914]: I0130 21:37:29.464107 4914 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:37:29 crc kubenswrapper[4914]: I0130 21:37:29.491658 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.103088694 podStartE2EDuration="9.491631733s" podCreationTimestamp="2026-01-30 21:37:20 +0000 UTC" firstStartedPulling="2026-01-30 21:37:21.427798926 +0000 UTC m=+1374.866435697" lastFinishedPulling="2026-01-30 21:37:28.816341975 +0000 UTC m=+1382.254978736" observedRunningTime="2026-01-30 21:37:29.488298222 +0000 UTC m=+1382.926934983" watchObservedRunningTime="2026-01-30 21:37:29.491631733 +0000 UTC m=+1382.930268504" Jan 30 21:37:30 crc kubenswrapper[4914]: I0130 21:37:30.775832 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:37:30 crc kubenswrapper[4914]: I0130 21:37:30.775877 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:37:31 crc kubenswrapper[4914]: I0130 21:37:31.482355 4914 generic.go:334] "Generic (PLEG): container finished" podID="62190399-de6a-48f4-ba20-2458736b5b37" containerID="3d9580134d260a1fdfdbba2e135073cb383e926e2e20e883e77c8ea6e25dc6c8" exitCode=0 Jan 30 21:37:31 crc kubenswrapper[4914]: I0130 21:37:31.482397 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rtrh8" event={"ID":"62190399-de6a-48f4-ba20-2458736b5b37","Type":"ContainerDied","Data":"3d9580134d260a1fdfdbba2e135073cb383e926e2e20e883e77c8ea6e25dc6c8"} Jan 30 21:37:31 crc kubenswrapper[4914]: I0130 21:37:31.801875 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:37:31 crc kubenswrapper[4914]: I0130 21:37:31.802191 4914 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:37:32 crc kubenswrapper[4914]: I0130 21:37:32.962465 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.122473 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-scripts\") pod \"62190399-de6a-48f4-ba20-2458736b5b37\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.122738 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-combined-ca-bundle\") pod \"62190399-de6a-48f4-ba20-2458736b5b37\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.122827 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkpt\" (UniqueName: \"kubernetes.io/projected/62190399-de6a-48f4-ba20-2458736b5b37-kube-api-access-cvkpt\") pod \"62190399-de6a-48f4-ba20-2458736b5b37\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.122911 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-config-data\") pod \"62190399-de6a-48f4-ba20-2458736b5b37\" (UID: \"62190399-de6a-48f4-ba20-2458736b5b37\") " Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.132544 4914 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-scripts" (OuterVolumeSpecName: "scripts") pod "62190399-de6a-48f4-ba20-2458736b5b37" (UID: "62190399-de6a-48f4-ba20-2458736b5b37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.133160 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62190399-de6a-48f4-ba20-2458736b5b37-kube-api-access-cvkpt" (OuterVolumeSpecName: "kube-api-access-cvkpt") pod "62190399-de6a-48f4-ba20-2458736b5b37" (UID: "62190399-de6a-48f4-ba20-2458736b5b37"). InnerVolumeSpecName "kube-api-access-cvkpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.167198 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-config-data" (OuterVolumeSpecName: "config-data") pod "62190399-de6a-48f4-ba20-2458736b5b37" (UID: "62190399-de6a-48f4-ba20-2458736b5b37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.170921 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62190399-de6a-48f4-ba20-2458736b5b37" (UID: "62190399-de6a-48f4-ba20-2458736b5b37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.225200 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.225236 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkpt\" (UniqueName: \"kubernetes.io/projected/62190399-de6a-48f4-ba20-2458736b5b37-kube-api-access-cvkpt\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.225253 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.225265 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62190399-de6a-48f4-ba20-2458736b5b37-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.499766 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rtrh8" event={"ID":"62190399-de6a-48f4-ba20-2458736b5b37","Type":"ContainerDied","Data":"3e3743c8a6319905740a1dd6261197a4f0c887a84cbe7263db26d0e2b2c153b0"} Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.499806 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3743c8a6319905740a1dd6261197a4f0c887a84cbe7263db26d0e2b2c153b0" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.499836 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rtrh8" Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.678076 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.678536 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-api" containerID="cri-o://34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442" gracePeriod=30 Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.678363 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-log" containerID="cri-o://9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa" gracePeriod=30 Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.738051 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.738310 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-log" containerID="cri-o://f1fb98fe4f62ea81fa973b4c031a0dd2f70fe282ea0ccd01772ab33e5481b60d" gracePeriod=30 Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.738747 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-metadata" containerID="cri-o://8d450d88b86419d8b0697a01ac8e68ed82276acc2ce676d606b984494f77c96a" gracePeriod=30 Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.753318 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:37:33 crc kubenswrapper[4914]: I0130 21:37:33.753569 4914 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" containerName="nova-scheduler-scheduler" containerID="cri-o://fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" gracePeriod=30 Jan 30 21:37:34 crc kubenswrapper[4914]: I0130 21:37:34.525012 4914 generic.go:334] "Generic (PLEG): container finished" podID="6009160e-a137-406a-9993-ce86e2236110" containerID="f1fb98fe4f62ea81fa973b4c031a0dd2f70fe282ea0ccd01772ab33e5481b60d" exitCode=143 Jan 30 21:37:34 crc kubenswrapper[4914]: I0130 21:37:34.525065 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6009160e-a137-406a-9993-ce86e2236110","Type":"ContainerDied","Data":"f1fb98fe4f62ea81fa973b4c031a0dd2f70fe282ea0ccd01772ab33e5481b60d"} Jan 30 21:37:34 crc kubenswrapper[4914]: I0130 21:37:34.528425 4914 generic.go:334] "Generic (PLEG): container finished" podID="7c7c7644-0241-4145-b369-89153413fd39" containerID="9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa" exitCode=143 Jan 30 21:37:34 crc kubenswrapper[4914]: I0130 21:37:34.528463 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7c7c7644-0241-4145-b369-89153413fd39","Type":"ContainerDied","Data":"9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa"} Jan 30 21:37:36 crc kubenswrapper[4914]: I0130 21:37:36.908923 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": read tcp 10.217.0.2:59582->10.217.0.220:8775: read: connection reset by peer" Jan 30 21:37:36 crc kubenswrapper[4914]: I0130 21:37:36.908928 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6009160e-a137-406a-9993-ce86e2236110" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": read tcp 10.217.0.2:59592->10.217.0.220:8775: read: connection reset by peer" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.442447 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.562128 4914 generic.go:334] "Generic (PLEG): container finished" podID="6009160e-a137-406a-9993-ce86e2236110" containerID="8d450d88b86419d8b0697a01ac8e68ed82276acc2ce676d606b984494f77c96a" exitCode=0 Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.562275 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6009160e-a137-406a-9993-ce86e2236110","Type":"ContainerDied","Data":"8d450d88b86419d8b0697a01ac8e68ed82276acc2ce676d606b984494f77c96a"} Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.564717 4914 generic.go:334] "Generic (PLEG): container finished" podID="7c7c7644-0241-4145-b369-89153413fd39" containerID="34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442" exitCode=0 Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.564763 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7c7c7644-0241-4145-b369-89153413fd39","Type":"ContainerDied","Data":"34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442"} Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.564791 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7c7c7644-0241-4145-b369-89153413fd39","Type":"ContainerDied","Data":"7ecf8538bc7ca57f118ed5b51d1623b10685517eb4f1c08872252f5cc069da32"} Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.564810 4914 scope.go:117] "RemoveContainer" containerID="34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.565041 
4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.616770 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7c7644-0241-4145-b369-89153413fd39-logs\") pod \"7c7c7644-0241-4145-b369-89153413fd39\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.616896 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-public-tls-certs\") pod \"7c7c7644-0241-4145-b369-89153413fd39\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.616960 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-config-data\") pod \"7c7c7644-0241-4145-b369-89153413fd39\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.617026 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpqdx\" (UniqueName: \"kubernetes.io/projected/7c7c7644-0241-4145-b369-89153413fd39-kube-api-access-vpqdx\") pod \"7c7c7644-0241-4145-b369-89153413fd39\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.617054 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-combined-ca-bundle\") pod \"7c7c7644-0241-4145-b369-89153413fd39\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.617194 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-internal-tls-certs\") pod \"7c7c7644-0241-4145-b369-89153413fd39\" (UID: \"7c7c7644-0241-4145-b369-89153413fd39\") " Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.617577 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7c7644-0241-4145-b369-89153413fd39-logs" (OuterVolumeSpecName: "logs") pod "7c7c7644-0241-4145-b369-89153413fd39" (UID: "7c7c7644-0241-4145-b369-89153413fd39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.618066 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7c7644-0241-4145-b369-89153413fd39-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.622616 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7c7644-0241-4145-b369-89153413fd39-kube-api-access-vpqdx" (OuterVolumeSpecName: "kube-api-access-vpqdx") pod "7c7c7644-0241-4145-b369-89153413fd39" (UID: "7c7c7644-0241-4145-b369-89153413fd39"). InnerVolumeSpecName "kube-api-access-vpqdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.644603 4914 scope.go:117] "RemoveContainer" containerID="9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.647098 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-config-data" (OuterVolumeSpecName: "config-data") pod "7c7c7644-0241-4145-b369-89153413fd39" (UID: "7c7c7644-0241-4145-b369-89153413fd39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.651808 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c7c7644-0241-4145-b369-89153413fd39" (UID: "7c7c7644-0241-4145-b369-89153413fd39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.675378 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c7c7644-0241-4145-b369-89153413fd39" (UID: "7c7c7644-0241-4145-b369-89153413fd39"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.678011 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7c7c7644-0241-4145-b369-89153413fd39" (UID: "7c7c7644-0241-4145-b369-89153413fd39"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.720086 4914 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.720404 4914 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.720413 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.720422 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpqdx\" (UniqueName: \"kubernetes.io/projected/7c7c7644-0241-4145-b369-89153413fd39-kube-api-access-vpqdx\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.720433 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7c7644-0241-4145-b369-89153413fd39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.725795 4914 scope.go:117] "RemoveContainer" containerID="34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442" Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.726236 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442\": container with ID starting with 34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442 not found: ID does not exist" 
containerID="34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.726279 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442"} err="failed to get container status \"34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442\": rpc error: code = NotFound desc = could not find container \"34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442\": container with ID starting with 34c60ccbcc7e0c1ce94ec10e8570f0b59141b1cb5f9e8d5fa7776a47085fb442 not found: ID does not exist" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.726304 4914 scope.go:117] "RemoveContainer" containerID="9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa" Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.727780 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa\": container with ID starting with 9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa not found: ID does not exist" containerID="9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.727821 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa"} err="failed to get container status \"9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa\": rpc error: code = NotFound desc = could not find container \"9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa\": container with ID starting with 9d1c27d0202abd058e3b759bd778378a8c8d0b54a3354eec8d00fb6d958ac2aa not found: ID does not exist" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.887433 4914 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.902795 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.909961 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.910340 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910351 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.910376 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerName="dnsmasq-dns" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910381 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerName="dnsmasq-dns" Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.910394 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-api" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910400 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-api" Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.910419 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-log" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910424 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-log" Jan 30 21:37:37 crc kubenswrapper[4914]: E0130 21:37:37.910436 4914 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="62190399-de6a-48f4-ba20-2458736b5b37" containerName="nova-manage" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910442 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="62190399-de6a-48f4-ba20-2458736b5b37" containerName="nova-manage" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910626 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-log" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910641 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="62190399-de6a-48f4-ba20-2458736b5b37" containerName="nova-manage" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910648 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba416ff3-6b9e-42c4-bf91-582be4df1ed1" containerName="dnsmasq-dns" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.910658 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7c7644-0241-4145-b369-89153413fd39" containerName="nova-api-api" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.911641 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.915033 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.915260 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.915801 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:37:37 crc kubenswrapper[4914]: I0130 21:37:37.945612 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.025773 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mpl\" (UniqueName: \"kubernetes.io/projected/25d6bd81-138b-48c0-8a75-586bb6489321-kube-api-access-t7mpl\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.025861 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-config-data\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.025926 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d6bd81-138b-48c0-8a75-586bb6489321-logs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.025949 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-public-tls-certs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.026043 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-internal-tls-certs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.026168 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.063414 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.128083 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-internal-tls-certs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.128249 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.128344 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mpl\" (UniqueName: \"kubernetes.io/projected/25d6bd81-138b-48c0-8a75-586bb6489321-kube-api-access-t7mpl\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.128387 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-config-data\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.128424 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d6bd81-138b-48c0-8a75-586bb6489321-logs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.128448 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-public-tls-certs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.129605 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d6bd81-138b-48c0-8a75-586bb6489321-logs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.132414 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-public-tls-certs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.136367 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-config-data\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.138482 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.156781 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d6bd81-138b-48c0-8a75-586bb6489321-internal-tls-certs\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.157360 4914 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t7mpl\" (UniqueName: \"kubernetes.io/projected/25d6bd81-138b-48c0-8a75-586bb6489321-kube-api-access-t7mpl\") pod \"nova-api-0\" (UID: \"25d6bd81-138b-48c0-8a75-586bb6489321\") " pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.230208 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-combined-ca-bundle\") pod \"6009160e-a137-406a-9993-ce86e2236110\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.230372 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6009160e-a137-406a-9993-ce86e2236110-logs\") pod \"6009160e-a137-406a-9993-ce86e2236110\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.230483 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qhz\" (UniqueName: \"kubernetes.io/projected/6009160e-a137-406a-9993-ce86e2236110-kube-api-access-q9qhz\") pod \"6009160e-a137-406a-9993-ce86e2236110\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.230541 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-nova-metadata-tls-certs\") pod \"6009160e-a137-406a-9993-ce86e2236110\" (UID: \"6009160e-a137-406a-9993-ce86e2236110\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.230595 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-config-data\") pod \"6009160e-a137-406a-9993-ce86e2236110\" (UID: 
\"6009160e-a137-406a-9993-ce86e2236110\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.233449 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6009160e-a137-406a-9993-ce86e2236110-logs" (OuterVolumeSpecName: "logs") pod "6009160e-a137-406a-9993-ce86e2236110" (UID: "6009160e-a137-406a-9993-ce86e2236110"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.238467 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6009160e-a137-406a-9993-ce86e2236110-kube-api-access-q9qhz" (OuterVolumeSpecName: "kube-api-access-q9qhz") pod "6009160e-a137-406a-9993-ce86e2236110" (UID: "6009160e-a137-406a-9993-ce86e2236110"). InnerVolumeSpecName "kube-api-access-q9qhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.289477 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523 is running failed: container process not found" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.289600 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6009160e-a137-406a-9993-ce86e2236110" (UID: "6009160e-a137-406a-9993-ce86e2236110"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.289839 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-config-data" (OuterVolumeSpecName: "config-data") pod "6009160e-a137-406a-9993-ce86e2236110" (UID: "6009160e-a137-406a-9993-ce86e2236110"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.290292 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523 is running failed: container process not found" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.290760 4914 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523 is running failed: container process not found" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.290840 4914 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" containerName="nova-scheduler-scheduler" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.314330 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6009160e-a137-406a-9993-ce86e2236110" (UID: "6009160e-a137-406a-9993-ce86e2236110"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.333230 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.333273 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.333286 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6009160e-a137-406a-9993-ce86e2236110-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.333298 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qhz\" (UniqueName: \"kubernetes.io/projected/6009160e-a137-406a-9993-ce86e2236110-kube-api-access-q9qhz\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.333310 4914 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6009160e-a137-406a-9993-ce86e2236110-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.360123 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.529424 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.576676 4914 generic.go:334] "Generic (PLEG): container finished" podID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" exitCode=0 Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.576831 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcf18dbe-0b07-4ae2-8398-c29fe48daaff","Type":"ContainerDied","Data":"fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523"} Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.576865 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcf18dbe-0b07-4ae2-8398-c29fe48daaff","Type":"ContainerDied","Data":"0cd023662c5b3a5f143c1787ebae0636c02174c80f655809b7e398c9a658295d"} Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.576886 4914 scope.go:117] "RemoveContainer" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.576992 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.582650 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.582747 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6009160e-a137-406a-9993-ce86e2236110","Type":"ContainerDied","Data":"2f8f319e2c738bdeeead6bf9429cef1d0d392a9f68b34cd51da48e80b2e877ee"} Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.613160 4914 scope.go:117] "RemoveContainer" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.613543 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523\": container with ID starting with fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523 not found: ID does not exist" containerID="fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.613589 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523"} err="failed to get container status \"fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523\": rpc error: code = NotFound desc = could not find container \"fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523\": container with ID starting with fd7a257ba9e5864b3f50e5c5f924c48ccc03b454b5b9d95fabff53bf5ac28523 not found: ID does not exist" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.613622 4914 scope.go:117] "RemoveContainer" containerID="8d450d88b86419d8b0697a01ac8e68ed82276acc2ce676d606b984494f77c96a" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.638501 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-combined-ca-bundle\") pod \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.638850 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-config-data\") pod \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.638928 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5fkj\" (UniqueName: \"kubernetes.io/projected/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-kube-api-access-l5fkj\") pod \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\" (UID: \"fcf18dbe-0b07-4ae2-8398-c29fe48daaff\") " Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.653233 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-kube-api-access-l5fkj" (OuterVolumeSpecName: "kube-api-access-l5fkj") pod "fcf18dbe-0b07-4ae2-8398-c29fe48daaff" (UID: "fcf18dbe-0b07-4ae2-8398-c29fe48daaff"). InnerVolumeSpecName "kube-api-access-l5fkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.656095 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.672617 4914 scope.go:117] "RemoveContainer" containerID="f1fb98fe4f62ea81fa973b4c031a0dd2f70fe282ea0ccd01772ab33e5481b60d" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.673899 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698025 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.698428 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-log" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698443 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-log" Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.698458 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" containerName="nova-scheduler-scheduler" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698464 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" containerName="nova-scheduler-scheduler" Jan 30 21:37:38 crc kubenswrapper[4914]: E0130 21:37:38.698493 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-metadata" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698499 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-metadata" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698722 4914 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-metadata" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698745 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="6009160e-a137-406a-9993-ce86e2236110" containerName="nova-metadata-log" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.698762 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" containerName="nova-scheduler-scheduler" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.699687 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-config-data" (OuterVolumeSpecName: "config-data") pod "fcf18dbe-0b07-4ae2-8398-c29fe48daaff" (UID: "fcf18dbe-0b07-4ae2-8398-c29fe48daaff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.699839 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.703320 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.707458 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.707910 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.738206 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcf18dbe-0b07-4ae2-8398-c29fe48daaff" (UID: "fcf18dbe-0b07-4ae2-8398-c29fe48daaff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.741542 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.741568 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5fkj\" (UniqueName: \"kubernetes.io/projected/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-kube-api-access-l5fkj\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.741577 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf18dbe-0b07-4ae2-8398-c29fe48daaff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.859456 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.859605 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gpls\" (UniqueName: \"kubernetes.io/projected/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-kube-api-access-7gpls\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.859669 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-config-data\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.859724 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.859780 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-logs\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.966275 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.967863 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gpls\" (UniqueName: \"kubernetes.io/projected/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-kube-api-access-7gpls\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.967975 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-config-data\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.968066 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.968188 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-logs\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.973618 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-logs\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc 
kubenswrapper[4914]: I0130 21:37:38.980059 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:38 crc kubenswrapper[4914]: I0130 21:37:38.980674 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.016942 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gpls\" (UniqueName: \"kubernetes.io/projected/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-kube-api-access-7gpls\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.018490 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4d1d85-ceb5-43da-85f2-a3b8d39590ab-config-data\") pod \"nova-metadata-0\" (UID: \"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab\") " pod="openstack/nova-metadata-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.020482 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.038272 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.201292 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.235584 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.266063 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.269063 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.272348 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.305732 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.392864 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4h6\" (UniqueName: \"kubernetes.io/projected/1cf3c517-6ee1-4af1-a62b-bf572596a05a-kube-api-access-2h4h6\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.393039 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cf3c517-6ee1-4af1-a62b-bf572596a05a-config-data\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.393094 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cf3c517-6ee1-4af1-a62b-bf572596a05a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.495417 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4h6\" (UniqueName: \"kubernetes.io/projected/1cf3c517-6ee1-4af1-a62b-bf572596a05a-kube-api-access-2h4h6\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.495519 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cf3c517-6ee1-4af1-a62b-bf572596a05a-config-data\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.495567 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cf3c517-6ee1-4af1-a62b-bf572596a05a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.499491 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cf3c517-6ee1-4af1-a62b-bf572596a05a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.502742 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1cf3c517-6ee1-4af1-a62b-bf572596a05a-config-data\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.513825 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4h6\" (UniqueName: \"kubernetes.io/projected/1cf3c517-6ee1-4af1-a62b-bf572596a05a-kube-api-access-2h4h6\") pod \"nova-scheduler-0\" (UID: \"1cf3c517-6ee1-4af1-a62b-bf572596a05a\") " pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.599335 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25d6bd81-138b-48c0-8a75-586bb6489321","Type":"ContainerStarted","Data":"7fe01cbd5db567fdced5904dfb514532dfb8234ea57f9e7db5fc2313fb69f2c5"} Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.599652 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25d6bd81-138b-48c0-8a75-586bb6489321","Type":"ContainerStarted","Data":"30cf3ffa03af293adcea7c9f4ba46b45bbffdefc340e3eb2db8b24c0fe7bf376"} Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.630242 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.661382 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.839118 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6009160e-a137-406a-9993-ce86e2236110" path="/var/lib/kubelet/pods/6009160e-a137-406a-9993-ce86e2236110/volumes" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.839802 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7c7644-0241-4145-b369-89153413fd39" path="/var/lib/kubelet/pods/7c7c7644-0241-4145-b369-89153413fd39/volumes" Jan 30 21:37:39 crc kubenswrapper[4914]: I0130 21:37:39.840379 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf18dbe-0b07-4ae2-8398-c29fe48daaff" path="/var/lib/kubelet/pods/fcf18dbe-0b07-4ae2-8398-c29fe48daaff/volumes" Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.122668 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.609396 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1cf3c517-6ee1-4af1-a62b-bf572596a05a","Type":"ContainerStarted","Data":"09c1278d07c70a9c1c20ab043ee1b6eca7c9887f3794a3f946779ec0c5ad6765"} Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.609743 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1cf3c517-6ee1-4af1-a62b-bf572596a05a","Type":"ContainerStarted","Data":"43216f0e66d53d93dc8b7d6534783179020eb2497fb18510fe9088d1a32f6897"} Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.611275 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"25d6bd81-138b-48c0-8a75-586bb6489321","Type":"ContainerStarted","Data":"c595e6b495499c122289704d105b2528a913cca7f73b73530f36024a3ec6f9ec"} Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.613129 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab","Type":"ContainerStarted","Data":"2dad9288f4fa856283902e15935f2a3f65e84398e381f5b4813fdcb6ba04dc66"} Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.613159 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab","Type":"ContainerStarted","Data":"ab1f8be70bbb852c5a57ff375a2df56c7f8880e4e7e58bc3615420cbe03c6a6b"} Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.613171 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a4d1d85-ceb5-43da-85f2-a3b8d39590ab","Type":"ContainerStarted","Data":"091c9bd966f796657b07c51aae48987a8083b2dcc27315aa7555ae39d86e7ffd"} Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.635916 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.6358847029999999 podStartE2EDuration="1.635884703s" podCreationTimestamp="2026-01-30 21:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:40.631629809 +0000 UTC m=+1394.070266570" watchObservedRunningTime="2026-01-30 21:37:40.635884703 +0000 UTC m=+1394.074521464" Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.662506 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.662485663 podStartE2EDuration="3.662485663s" podCreationTimestamp="2026-01-30 21:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-30 21:37:40.652230312 +0000 UTC m=+1394.090867073" watchObservedRunningTime="2026-01-30 21:37:40.662485663 +0000 UTC m=+1394.101122424" Jan 30 21:37:40 crc kubenswrapper[4914]: I0130 21:37:40.679643 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.679621992 podStartE2EDuration="2.679621992s" podCreationTimestamp="2026-01-30 21:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:40.672896118 +0000 UTC m=+1394.111532889" watchObservedRunningTime="2026-01-30 21:37:40.679621992 +0000 UTC m=+1394.118258753" Jan 30 21:37:44 crc kubenswrapper[4914]: I0130 21:37:44.021578 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:37:44 crc kubenswrapper[4914]: I0130 21:37:44.022108 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:37:44 crc kubenswrapper[4914]: I0130 21:37:44.631823 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 21:37:48 crc kubenswrapper[4914]: I0130 21:37:48.360552 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:37:48 crc kubenswrapper[4914]: I0130 21:37:48.360923 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.021457 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.021802 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.378892 4914 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="25d6bd81-138b-48c0-8a75-586bb6489321" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.228:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.378892 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="25d6bd81-138b-48c0-8a75-586bb6489321" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.228:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.631513 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.670455 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 21:37:49 crc kubenswrapper[4914]: I0130 21:37:49.749667 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 21:37:50 crc kubenswrapper[4914]: I0130 21:37:50.033899 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0a4d1d85-ceb5-43da-85f2-a3b8d39590ab" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:37:50 crc kubenswrapper[4914]: I0130 21:37:50.033893 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0a4d1d85-ceb5-43da-85f2-a3b8d39590ab" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:37:50 crc kubenswrapper[4914]: I0130 21:37:50.872871 4914 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 21:37:56 crc kubenswrapper[4914]: I0130 21:37:56.983294 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:37:56 crc kubenswrapper[4914]: I0130 21:37:56.983754 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:37:58 crc kubenswrapper[4914]: I0130 21:37:58.367129 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:37:58 crc kubenswrapper[4914]: I0130 21:37:58.367744 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:37:58 crc kubenswrapper[4914]: I0130 21:37:58.370270 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:37:58 crc kubenswrapper[4914]: I0130 21:37:58.379966 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:37:58 crc kubenswrapper[4914]: I0130 21:37:58.825270 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:37:58 crc kubenswrapper[4914]: I0130 21:37:58.831347 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:37:59 crc kubenswrapper[4914]: I0130 21:37:59.033794 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 21:37:59 crc kubenswrapper[4914]: I0130 
21:37:59.034329 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 21:37:59 crc kubenswrapper[4914]: I0130 21:37:59.038545 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 21:37:59 crc kubenswrapper[4914]: I0130 21:37:59.837416 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 21:37:59 crc kubenswrapper[4914]: I0130 21:37:59.988607 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7kk2g"] Jan 30 21:37:59 crc kubenswrapper[4914]: I0130 21:37:59.991479 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.000817 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7kk2g"] Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.078902 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzcbr\" (UniqueName: \"kubernetes.io/projected/ae495b2b-b99b-4051-bd64-c54667d4d9bc-kube-api-access-lzcbr\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.078954 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-utilities\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.079125 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-catalog-content\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.181270 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzcbr\" (UniqueName: \"kubernetes.io/projected/ae495b2b-b99b-4051-bd64-c54667d4d9bc-kube-api-access-lzcbr\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.181358 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-utilities\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.181443 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-catalog-content\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.181902 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-utilities\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.182222 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-catalog-content\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.208625 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzcbr\" (UniqueName: \"kubernetes.io/projected/ae495b2b-b99b-4051-bd64-c54667d4d9bc-kube-api-access-lzcbr\") pod \"redhat-operators-7kk2g\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:00 crc kubenswrapper[4914]: I0130 21:38:00.324836 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:38:01 crc kubenswrapper[4914]: I0130 21:38:01.414395 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7kk2g"] Jan 30 21:38:01 crc kubenswrapper[4914]: I0130 21:38:01.847969 4914 generic.go:334] "Generic (PLEG): container finished" podID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerID="1496e7b71ea268b10c71328e8c3cd10d6a54f8cedabb613af827c43f3aa97748" exitCode=0 Jan 30 21:38:01 crc kubenswrapper[4914]: I0130 21:38:01.848073 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerDied","Data":"1496e7b71ea268b10c71328e8c3cd10d6a54f8cedabb613af827c43f3aa97748"} Jan 30 21:38:01 crc kubenswrapper[4914]: I0130 21:38:01.848270 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerStarted","Data":"ecb965cdf9835ab877c786fec32f546a74d234cf7b7c51d52720619d4905dbf6"} Jan 30 21:38:02 crc kubenswrapper[4914]: I0130 21:38:02.858840 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerStarted","Data":"3fc7c6999454190607a44a3782b41950bd59fa77c68315821e44554b60bc5da1"} Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.374540 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-t4kd2"] Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.385511 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-t4kd2"] Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.493974 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-f9bxt"] Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.495616 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.498214 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.516907 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-f9bxt"] Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.573986 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjq6m\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-kube-api-access-cjq6m\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.574052 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-scripts\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc 
kubenswrapper[4914]: I0130 21:38:09.574121 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-config-data\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.574135 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-certs\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.574183 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-combined-ca-bundle\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.675411 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-config-data\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.675450 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-certs\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.675503 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-combined-ca-bundle\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.675602 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjq6m\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-kube-api-access-cjq6m\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.675637 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-scripts\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.685489 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-scripts\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.685656 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-certs\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.687831 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-config-data\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.689382 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-combined-ca-bundle\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.700226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjq6m\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-kube-api-access-cjq6m\") pod \"cloudkitty-db-sync-f9bxt\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") " pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.817103 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-f9bxt" Jan 30 21:38:09 crc kubenswrapper[4914]: I0130 21:38:09.839468 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0548f63-8249-4708-88d9-b3f663b28778" path="/var/lib/kubelet/pods/a0548f63-8249-4708-88d9-b3f663b28778/volumes" Jan 30 21:38:10 crc kubenswrapper[4914]: I0130 21:38:10.373075 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-f9bxt"] Jan 30 21:38:10 crc kubenswrapper[4914]: I0130 21:38:10.939498 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-f9bxt" event={"ID":"8ce3d73e-f519-423b-81c0-d120a416b488","Type":"ContainerStarted","Data":"863abe8d1c7f1c569da5a45a039691412553cb8452748bb645bba991acab0053"} Jan 30 21:38:10 crc kubenswrapper[4914]: I0130 21:38:10.939916 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-f9bxt" event={"ID":"8ce3d73e-f519-423b-81c0-d120a416b488","Type":"ContainerStarted","Data":"ccd484824f2faf76970e27ee58c4c2b88496bdf1324d82b0c29169d7125f4492"} Jan 30 21:38:10 crc kubenswrapper[4914]: I0130 21:38:10.958847 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-f9bxt" podStartSLOduration=1.683575509 podStartE2EDuration="1.958833818s" podCreationTimestamp="2026-01-30 21:38:09 +0000 UTC" firstStartedPulling="2026-01-30 21:38:10.385143904 +0000 UTC m=+1423.823780665" lastFinishedPulling="2026-01-30 21:38:10.660402213 +0000 UTC m=+1424.099038974" observedRunningTime="2026-01-30 21:38:10.957482275 +0000 UTC m=+1424.396119036" watchObservedRunningTime="2026-01-30 21:38:10.958833818 +0000 UTC m=+1424.397470579" Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.333890 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.334199 4914 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-central-agent" containerID="cri-o://c137c0d9d4c34a980d068a6d9afe97ff7d2ff7fb6e8368d990c2a0b69ae4356a" gracePeriod=30 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.334238 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="sg-core" containerID="cri-o://637173705c6758c96e2f74e123347162e6406afd2ca310ef807710a339a2d267" gracePeriod=30 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.334245 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="proxy-httpd" containerID="cri-o://a5c40f88a01bdeebb7910835c95d0ff73d2755e653b1ae63eb411c275266e855" gracePeriod=30 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.334310 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-notification-agent" containerID="cri-o://2d18ebea566347b4834ffdf0dd7c1479c533c39dc9ed79e4f0abfc93cf843c54" gracePeriod=30 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.833552 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.951762 4914 generic.go:334] "Generic (PLEG): container finished" podID="b0925713-10df-46ab-b311-1a286b6c4515" containerID="a5c40f88a01bdeebb7910835c95d0ff73d2755e653b1ae63eb411c275266e855" exitCode=0 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.951794 4914 generic.go:334] "Generic (PLEG): container finished" podID="b0925713-10df-46ab-b311-1a286b6c4515" containerID="637173705c6758c96e2f74e123347162e6406afd2ca310ef807710a339a2d267" exitCode=2 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.951804 
4914 generic.go:334] "Generic (PLEG): container finished" podID="b0925713-10df-46ab-b311-1a286b6c4515" containerID="c137c0d9d4c34a980d068a6d9afe97ff7d2ff7fb6e8368d990c2a0b69ae4356a" exitCode=0 Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.952586 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerDied","Data":"a5c40f88a01bdeebb7910835c95d0ff73d2755e653b1ae63eb411c275266e855"} Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.952612 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerDied","Data":"637173705c6758c96e2f74e123347162e6406afd2ca310ef807710a339a2d267"} Jan 30 21:38:11 crc kubenswrapper[4914]: I0130 21:38:11.952622 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerDied","Data":"c137c0d9d4c34a980d068a6d9afe97ff7d2ff7fb6e8368d990c2a0b69ae4356a"} Jan 30 21:38:12 crc kubenswrapper[4914]: I0130 21:38:12.736038 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:38:12 crc kubenswrapper[4914]: I0130 21:38:12.963556 4914 generic.go:334] "Generic (PLEG): container finished" podID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerID="3fc7c6999454190607a44a3782b41950bd59fa77c68315821e44554b60bc5da1" exitCode=0 Jan 30 21:38:12 crc kubenswrapper[4914]: I0130 21:38:12.963607 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerDied","Data":"3fc7c6999454190607a44a3782b41950bd59fa77c68315821e44554b60bc5da1"} Jan 30 21:38:12 crc kubenswrapper[4914]: I0130 21:38:12.973119 4914 generic.go:334] "Generic (PLEG): container finished" podID="b0925713-10df-46ab-b311-1a286b6c4515" 
containerID="2d18ebea566347b4834ffdf0dd7c1479c533c39dc9ed79e4f0abfc93cf843c54" exitCode=0 Jan 30 21:38:12 crc kubenswrapper[4914]: I0130 21:38:12.973158 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerDied","Data":"2d18ebea566347b4834ffdf0dd7c1479c533c39dc9ed79e4f0abfc93cf843c54"} Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.252015 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.370786 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-combined-ca-bundle\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.370841 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-sg-core-conf-yaml\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.370880 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvgcr\" (UniqueName: \"kubernetes.io/projected/b0925713-10df-46ab-b311-1a286b6c4515-kube-api-access-fvgcr\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.370961 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-run-httpd\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 
21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371006 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-scripts\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371028 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-config-data\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371084 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-log-httpd\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371160 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-ceilometer-tls-certs\") pod \"b0925713-10df-46ab-b311-1a286b6c4515\" (UID: \"b0925713-10df-46ab-b311-1a286b6c4515\") " Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371365 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371613 4914 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.371865 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.398075 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-scripts" (OuterVolumeSpecName: "scripts") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.399909 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0925713-10df-46ab-b311-1a286b6c4515-kube-api-access-fvgcr" (OuterVolumeSpecName: "kube-api-access-fvgcr") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). InnerVolumeSpecName "kube-api-access-fvgcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.438784 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.476117 4914 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.476299 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvgcr\" (UniqueName: \"kubernetes.io/projected/b0925713-10df-46ab-b311-1a286b6c4515-kube-api-access-fvgcr\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.476352 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.476401 4914 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0925713-10df-46ab-b311-1a286b6c4515-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.489960 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.493840 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.543819 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-config-data" (OuterVolumeSpecName: "config-data") pod "b0925713-10df-46ab-b311-1a286b6c4515" (UID: "b0925713-10df-46ab-b311-1a286b6c4515"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.578449 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.578510 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.578522 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0925713-10df-46ab-b311-1a286b6c4515-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:13 crc kubenswrapper[4914]: E0130 21:38:13.950418 4914 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0925713_10df_46ab_b311_1a286b6c4515.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0925713_10df_46ab_b311_1a286b6c4515.slice/crio-3256250860d0a832432bfe7e47c29f9ca08c6d2f59cbb94c6f5922bddef2c3b7\": RecentStats: unable to find data in memory cache]" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.984575 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0925713-10df-46ab-b311-1a286b6c4515","Type":"ContainerDied","Data":"3256250860d0a832432bfe7e47c29f9ca08c6d2f59cbb94c6f5922bddef2c3b7"} Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.984631 4914 scope.go:117] "RemoveContainer" containerID="a5c40f88a01bdeebb7910835c95d0ff73d2755e653b1ae63eb411c275266e855" Jan 30 21:38:13 crc kubenswrapper[4914]: I0130 21:38:13.984814 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.011322 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.030045 4914 scope.go:117] "RemoveContainer" containerID="637173705c6758c96e2f74e123347162e6406afd2ca310ef807710a339a2d267" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.032832 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.048967 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:14 crc kubenswrapper[4914]: E0130 21:38:14.049377 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-notification-agent" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049394 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-notification-agent" Jan 30 21:38:14 crc kubenswrapper[4914]: E0130 21:38:14.049404 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-central-agent" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049411 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0925713-10df-46ab-b311-1a286b6c4515" 
containerName="ceilometer-central-agent" Jan 30 21:38:14 crc kubenswrapper[4914]: E0130 21:38:14.049471 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="proxy-httpd" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049478 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="proxy-httpd" Jan 30 21:38:14 crc kubenswrapper[4914]: E0130 21:38:14.049491 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="sg-core" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049497 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="sg-core" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049688 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="proxy-httpd" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049725 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-central-agent" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049739 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="ceilometer-notification-agent" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.049746 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0925713-10df-46ab-b311-1a286b6c4515" containerName="sg-core" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.051512 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.054306 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.055299 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.066972 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.081754 4914 scope.go:117] "RemoveContainer" containerID="2d18ebea566347b4834ffdf0dd7c1479c533c39dc9ed79e4f0abfc93cf843c54" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.085107 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.142227 4914 scope.go:117] "RemoveContainer" containerID="c137c0d9d4c34a980d068a6d9afe97ff7d2ff7fb6e8368d990c2a0b69ae4356a" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.193248 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.193311 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psn58\" (UniqueName: \"kubernetes.io/projected/49463c19-f32a-4288-9a5a-51d9c7b11e42-kube-api-access-psn58\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.193426 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49463c19-f32a-4288-9a5a-51d9c7b11e42-log-httpd\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.193496 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-scripts\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.193665 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.193889 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.194012 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-config-data\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.194105 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49463c19-f32a-4288-9a5a-51d9c7b11e42-run-httpd\") pod \"ceilometer-0\" (UID: 
\"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295571 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-scripts\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295623 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295666 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295693 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-config-data\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295733 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49463c19-f32a-4288-9a5a-51d9c7b11e42-run-httpd\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295786 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295807 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psn58\" (UniqueName: \"kubernetes.io/projected/49463c19-f32a-4288-9a5a-51d9c7b11e42-kube-api-access-psn58\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.295840 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49463c19-f32a-4288-9a5a-51d9c7b11e42-log-httpd\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.296242 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49463c19-f32a-4288-9a5a-51d9c7b11e42-log-httpd\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.297224 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49463c19-f32a-4288-9a5a-51d9c7b11e42-run-httpd\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.303596 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc 
kubenswrapper[4914]: I0130 21:38:14.304204 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-scripts\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.306352 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.306821 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-config-data\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.308288 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49463c19-f32a-4288-9a5a-51d9c7b11e42-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.325494 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psn58\" (UniqueName: \"kubernetes.io/projected/49463c19-f32a-4288-9a5a-51d9c7b11e42-kube-api-access-psn58\") pod \"ceilometer-0\" (UID: \"49463c19-f32a-4288-9a5a-51d9c7b11e42\") " pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.391189 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:14 crc kubenswrapper[4914]: W0130 21:38:14.972729 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49463c19_f32a_4288_9a5a_51d9c7b11e42.slice/crio-09232c04b5cd20af22fb8b05095cd8fa977b936296c895b7db701e242213749d WatchSource:0}: Error finding container 09232c04b5cd20af22fb8b05095cd8fa977b936296c895b7db701e242213749d: Status 404 returned error can't find the container with id 09232c04b5cd20af22fb8b05095cd8fa977b936296c895b7db701e242213749d Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.973443 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.997178 4914 generic.go:334] "Generic (PLEG): container finished" podID="8ce3d73e-f519-423b-81c0-d120a416b488" containerID="863abe8d1c7f1c569da5a45a039691412553cb8452748bb645bba991acab0053" exitCode=0 Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.997245 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-f9bxt" event={"ID":"8ce3d73e-f519-423b-81c0-d120a416b488","Type":"ContainerDied","Data":"863abe8d1c7f1c569da5a45a039691412553cb8452748bb645bba991acab0053"} Jan 30 21:38:14 crc kubenswrapper[4914]: I0130 21:38:14.999532 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49463c19-f32a-4288-9a5a-51d9c7b11e42","Type":"ContainerStarted","Data":"09232c04b5cd20af22fb8b05095cd8fa977b936296c895b7db701e242213749d"} Jan 30 21:38:15 crc kubenswrapper[4914]: I0130 21:38:15.002066 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerStarted","Data":"1efcced53d4c59f33eecabc32c02a82a327a73342e8b369fa2aa989e0130df29"} Jan 30 21:38:15 crc kubenswrapper[4914]: I0130 21:38:15.043294 4914 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7kk2g" podStartSLOduration=3.863284877 podStartE2EDuration="16.043278685s" podCreationTimestamp="2026-01-30 21:37:59 +0000 UTC" firstStartedPulling="2026-01-30 21:38:01.849862423 +0000 UTC m=+1415.288499184" lastFinishedPulling="2026-01-30 21:38:14.029856211 +0000 UTC m=+1427.468492992" observedRunningTime="2026-01-30 21:38:15.03487187 +0000 UTC m=+1428.473508641" watchObservedRunningTime="2026-01-30 21:38:15.043278685 +0000 UTC m=+1428.481915446"
Jan 30 21:38:15 crc kubenswrapper[4914]: I0130 21:38:15.838630 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0925713-10df-46ab-b311-1a286b6c4515" path="/var/lib/kubelet/pods/b0925713-10df-46ab-b311-1a286b6c4515/volumes"
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.496626 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-f9bxt"
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.508994 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="rabbitmq" containerID="cri-o://70cbbbbafe8cea99c64866db7ecbc122e676e7f4c10a9cddda720373900c0b08" gracePeriod=604796
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.644945 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-combined-ca-bundle\") pod \"8ce3d73e-f519-423b-81c0-d120a416b488\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") "
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.645046 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-certs\") pod \"8ce3d73e-f519-423b-81c0-d120a416b488\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") "
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.645122 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-scripts\") pod \"8ce3d73e-f519-423b-81c0-d120a416b488\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") "
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.645273 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjq6m\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-kube-api-access-cjq6m\") pod \"8ce3d73e-f519-423b-81c0-d120a416b488\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") "
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.645446 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-config-data\") pod \"8ce3d73e-f519-423b-81c0-d120a416b488\" (UID: \"8ce3d73e-f519-423b-81c0-d120a416b488\") "
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.650775 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-certs" (OuterVolumeSpecName: "certs") pod "8ce3d73e-f519-423b-81c0-d120a416b488" (UID: "8ce3d73e-f519-423b-81c0-d120a416b488"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.653549 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-scripts" (OuterVolumeSpecName: "scripts") pod "8ce3d73e-f519-423b-81c0-d120a416b488" (UID: "8ce3d73e-f519-423b-81c0-d120a416b488"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.692863 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-kube-api-access-cjq6m" (OuterVolumeSpecName: "kube-api-access-cjq6m") pod "8ce3d73e-f519-423b-81c0-d120a416b488" (UID: "8ce3d73e-f519-423b-81c0-d120a416b488"). InnerVolumeSpecName "kube-api-access-cjq6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.700868 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ce3d73e-f519-423b-81c0-d120a416b488" (UID: "8ce3d73e-f519-423b-81c0-d120a416b488"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.701009 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-config-data" (OuterVolumeSpecName: "config-data") pod "8ce3d73e-f519-423b-81c0-d120a416b488" (UID: "8ce3d73e-f519-423b-81c0-d120a416b488"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.749533 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.749567 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.749580 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjq6m\" (UniqueName: \"kubernetes.io/projected/8ce3d73e-f519-423b-81c0-d120a416b488-kube-api-access-cjq6m\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.749593 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:16 crc kubenswrapper[4914]: I0130 21:38:16.749607 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ce3d73e-f519-423b-81c0-d120a416b488-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.021456 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-f9bxt" event={"ID":"8ce3d73e-f519-423b-81c0-d120a416b488","Type":"ContainerDied","Data":"ccd484824f2faf76970e27ee58c4c2b88496bdf1324d82b0c29169d7125f4492"}
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.021753 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-f9bxt"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.021763 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccd484824f2faf76970e27ee58c4c2b88496bdf1324d82b0c29169d7125f4492"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.093917 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-m6cp5"]
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.105400 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-m6cp5"]
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.210610 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-klrhc"]
Jan 30 21:38:17 crc kubenswrapper[4914]: E0130 21:38:17.211030 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce3d73e-f519-423b-81c0-d120a416b488" containerName="cloudkitty-db-sync"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.211046 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce3d73e-f519-423b-81c0-d120a416b488" containerName="cloudkitty-db-sync"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.211239 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce3d73e-f519-423b-81c0-d120a416b488" containerName="cloudkitty-db-sync"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.212209 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.226983 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-klrhc"]
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.227238 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.363132 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-config-data\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.363175 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-certs\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.363356 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbmb\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-kube-api-access-nrbmb\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.363670 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-scripts\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.363878 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-combined-ca-bundle\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.465418 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-combined-ca-bundle\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.465543 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-config-data\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.465575 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-certs\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.465631 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbmb\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-kube-api-access-nrbmb\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.465761 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-scripts\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.473676 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-scripts\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.473777 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-combined-ca-bundle\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.473827 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-certs\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.482606 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-config-data\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.495390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbmb\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-kube-api-access-nrbmb\") pod \"cloudkitty-storageinit-klrhc\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.526797 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-klrhc"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.718613 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused"
Jan 30 21:38:17 crc kubenswrapper[4914]: I0130 21:38:17.835431 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b23541-8035-417c-8ea6-69cf3c0b2758" path="/var/lib/kubelet/pods/60b23541-8035-417c-8ea6-69cf3c0b2758/volumes"
Jan 30 21:38:18 crc kubenswrapper[4914]: I0130 21:38:18.292259 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerName="rabbitmq" containerID="cri-o://81433ff0f628af87671a18cfd72c656192e9c4f0ecad8d43815099cf1bfc51c1" gracePeriod=604795
Jan 30 21:38:20 crc kubenswrapper[4914]: I0130 21:38:20.325286 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7kk2g"
Jan 30 21:38:20 crc kubenswrapper[4914]: I0130 21:38:20.325566 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7kk2g"
Jan 30 21:38:20 crc kubenswrapper[4914]: I0130 21:38:20.951207 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-klrhc"]
Jan 30 21:38:21 crc kubenswrapper[4914]: I0130 21:38:21.070896 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-klrhc" event={"ID":"097daca3-adbd-4d0a-8606-876822a0cd4a","Type":"ContainerStarted","Data":"a067cbb10bd8825cf9e2f8e12892e4d6229e4c6593352cbe92c233e5c8b7d3db"}
Jan 30 21:38:21 crc kubenswrapper[4914]: I0130 21:38:21.395356 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:38:21 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:38:21 crc kubenswrapper[4914]: >
Jan 30 21:38:22 crc kubenswrapper[4914]: I0130 21:38:22.087502 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-klrhc" event={"ID":"097daca3-adbd-4d0a-8606-876822a0cd4a","Type":"ContainerStarted","Data":"19aa1cbe155106ce92b6c981189943363d426f118a6814e2ffba0bc2f62e79a7"}
Jan 30 21:38:22 crc kubenswrapper[4914]: I0130 21:38:22.089513 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49463c19-f32a-4288-9a5a-51d9c7b11e42","Type":"ContainerStarted","Data":"1ddbc56dfb9b3d6f78b5a74430ef09caa62dc217a1510d2031e4c19bb59bbd4b"}
Jan 30 21:38:22 crc kubenswrapper[4914]: I0130 21:38:22.113114 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-klrhc" podStartSLOduration=5.113096262 podStartE2EDuration="5.113096262s" podCreationTimestamp="2026-01-30 21:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:22.105930717 +0000 UTC m=+1435.544567488" watchObservedRunningTime="2026-01-30 21:38:22.113096262 +0000 UTC m=+1435.551733023"
Jan 30 21:38:23 crc kubenswrapper[4914]: I0130 21:38:23.100105 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49463c19-f32a-4288-9a5a-51d9c7b11e42","Type":"ContainerStarted","Data":"161df829bc9e0d714bcfb2a3f3f26b5c07e955574de5f64cc808119332cc51c2"}
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.141131 4914 generic.go:334] "Generic (PLEG): container finished" podID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerID="70cbbbbafe8cea99c64866db7ecbc122e676e7f4c10a9cddda720373900c0b08" exitCode=0
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.141545 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c506e0ae-e4b2-4cd7-87ea-bc10619f874e","Type":"ContainerDied","Data":"70cbbbbafe8cea99c64866db7ecbc122e676e7f4c10a9cddda720373900c0b08"}
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.141569 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c506e0ae-e4b2-4cd7-87ea-bc10619f874e","Type":"ContainerDied","Data":"a2496c0d0ae4fc3f67cdd3ffe07d3825c547990ddea2f3de0bdb1f06d1255103"}
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.141580 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2496c0d0ae4fc3f67cdd3ffe07d3825c547990ddea2f3de0bdb1f06d1255103"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.156850 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-txsxt"]
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.159104 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.160031 4914 generic.go:334] "Generic (PLEG): container finished" podID="097daca3-adbd-4d0a-8606-876822a0cd4a" containerID="19aa1cbe155106ce92b6c981189943363d426f118a6814e2ffba0bc2f62e79a7" exitCode=0
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.160091 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-klrhc" event={"ID":"097daca3-adbd-4d0a-8606-876822a0cd4a","Type":"ContainerDied","Data":"19aa1cbe155106ce92b6c981189943363d426f118a6814e2ffba0bc2f62e79a7"}
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.162329 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.164308 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49463c19-f32a-4288-9a5a-51d9c7b11e42","Type":"ContainerStarted","Data":"4a40989b1bc7b597a20e51f1e2d550c0b02da796ead08f614cab7d21e7c67533"}
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.176574 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-txsxt"]
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.178969 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.287548 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmk5r\" (UniqueName: \"kubernetes.io/projected/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-kube-api-access-jmk5r\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.287764 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.287958 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.287990 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.288019 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.289889 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-config\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.289922 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.391691 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2f7h\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-kube-api-access-f2f7h\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.401378 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.401452 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-pod-info\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.401508 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-tls\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.408353 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-erlang-cookie-secret\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.408395 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-config-data\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.430787 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-server-conf\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.430907 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-erlang-cookie\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.430934 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-plugins\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431007 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-plugins-conf\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431032 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-confd\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") "
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431290 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-config\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431353 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431477 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmk5r\" (UniqueName: \"kubernetes.io/projected/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-kube-api-access-jmk5r\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431642 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431843 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431873 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.431900 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.433513 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-config\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.434980 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.443273 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.443953 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.444279 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.444885 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.448221 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.451246 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.452951 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.470843 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.472154 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-pod-info" (OuterVolumeSpecName: "pod-info") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.551640 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-kube-api-access-f2f7h" (OuterVolumeSpecName: "kube-api-access-f2f7h") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "kube-api-access-f2f7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.552007 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.552060 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.552075 4914 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.552086 4914 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-pod-info\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.552097 4914 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.570980 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.604630 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmk5r\" (UniqueName: \"kubernetes.io/projected/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-kube-api-access-jmk5r\") pod \"dnsmasq-dns-dbb88bf8c-txsxt\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") " pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.640692 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-config-data" (OuterVolumeSpecName: "config-data") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.655438 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2f7h\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-kube-api-access-f2f7h\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.655466 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.655475 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.726467 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-server-conf" (OuterVolumeSpecName: "server-conf") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "server-conf".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:24 crc kubenswrapper[4914]: E0130 21:38:24.756559 4914 reconciler_common.go:156] "operationExecutor.UnmountVolume failed (controllerAttachDetachEnabled true) for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") : UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes/kubernetes.io~csi/pvc-25113a91-49b5-491c-99f4-7569d427709f/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes/kubernetes.io~csi/pvc-25113a91-49b5-491c-99f4-7569d427709f/vol_data.json]: open /var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes/kubernetes.io~csi/pvc-25113a91-49b5-491c-99f4-7569d427709f/vol_data.json: no such file or directory" err="UnmountVolume.NewUnmounter failed for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\" (UID: \"c506e0ae-e4b2-4cd7-87ea-bc10619f874e\") : kubernetes.io/csi: unmounter failed to load volume data file [/var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes/kubernetes.io~csi/pvc-25113a91-49b5-491c-99f4-7569d427709f/mount]: kubernetes.io/csi: failed to open volume data file [/var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes/kubernetes.io~csi/pvc-25113a91-49b5-491c-99f4-7569d427709f/vol_data.json]: open 
/var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes/kubernetes.io~csi/pvc-25113a91-49b5-491c-99f4-7569d427709f/vol_data.json: no such file or directory" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.757093 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f" (OuterVolumeSpecName: "persistence") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "pvc-25113a91-49b5-491c-99f4-7569d427709f". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.757335 4914 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.757372 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") on node \"crc\" " Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.790305 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.791304 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-25113a91-49b5-491c-99f4-7569d427709f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f") on node "crc" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.798077 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.815594 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c506e0ae-e4b2-4cd7-87ea-bc10619f874e" (UID: "c506e0ae-e4b2-4cd7-87ea-bc10619f874e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.858825 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c506e0ae-e4b2-4cd7-87ea-bc10619f874e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:24 crc kubenswrapper[4914]: I0130 21:38:24.859137 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.180921 4914 generic.go:334] "Generic (PLEG): container finished" podID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerID="81433ff0f628af87671a18cfd72c656192e9c4f0ecad8d43815099cf1bfc51c1" exitCode=0 Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.181469 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f394410a-5ff7-4a0c-84ec-4b60c63c707c","Type":"ContainerDied","Data":"81433ff0f628af87671a18cfd72c656192e9c4f0ecad8d43815099cf1bfc51c1"} Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.181503 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f394410a-5ff7-4a0c-84ec-4b60c63c707c","Type":"ContainerDied","Data":"cdaf857891566689b9d960e23755cfc5fe462fa524e87d277df9c1cacdaa1b5b"} Jan 30 21:38:25 crc kubenswrapper[4914]: 
I0130 21:38:25.181516 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdaf857891566689b9d960e23755cfc5fe462fa524e87d277df9c1cacdaa1b5b" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.181582 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.247496 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.282350 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.299394 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.320291 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:38:25 crc kubenswrapper[4914]: E0130 21:38:25.321115 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerName="setup-container" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.321142 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerName="setup-container" Jan 30 21:38:25 crc kubenswrapper[4914]: E0130 21:38:25.321154 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="setup-container" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.321164 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="setup-container" Jan 30 21:38:25 crc kubenswrapper[4914]: E0130 21:38:25.321191 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerName="rabbitmq" Jan 30 
21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.321200 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerName="rabbitmq" Jan 30 21:38:25 crc kubenswrapper[4914]: E0130 21:38:25.321220 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="rabbitmq" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.321228 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="rabbitmq" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.321491 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" containerName="rabbitmq" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.321516 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" containerName="rabbitmq" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.323037 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.328034 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.328196 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.332288 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.332365 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.335240 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.335494 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ldxml" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.344581 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.369395 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-plugins\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.369460 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f394410a-5ff7-4a0c-84ec-4b60c63c707c-erlang-cookie-secret\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: 
I0130 21:38:25.369485 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-plugins-conf\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.369535 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-config-data\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.369562 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-erlang-cookie\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.370806 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.371797 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.371843 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-confd\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.371864 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-tls\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.371904 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-server-conf\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.371961 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hqh2\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-kube-api-access-6hqh2\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.372079 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f394410a-5ff7-4a0c-84ec-4b60c63c707c-pod-info\") pod \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\" (UID: \"f394410a-5ff7-4a0c-84ec-4b60c63c707c\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.372469 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.373269 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.373292 4914 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.376949 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.381141 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f394410a-5ff7-4a0c-84ec-4b60c63c707c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). 
InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.387404 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.388648 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f394410a-5ff7-4a0c-84ec-4b60c63c707c-pod-info" (OuterVolumeSpecName: "pod-info") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.404847 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-kube-api-access-6hqh2" (OuterVolumeSpecName: "kube-api-access-6hqh2") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "kube-api-access-6hqh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.408883 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.450963 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-config-data" (OuterVolumeSpecName: "config-data") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.475424 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.475500 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.475517 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc011821-8710-499b-8547-4ab18c9e2592-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.475544 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.475986 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc011821-8710-499b-8547-4ab18c9e2592-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " 
pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476295 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476347 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476389 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mt2h\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-kube-api-access-8mt2h\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476494 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476591 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 
30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476634 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476752 4914 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f394410a-5ff7-4a0c-84ec-4b60c63c707c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476769 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476779 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476787 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476796 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hqh2\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-kube-api-access-6hqh2\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.476805 4914 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f394410a-5ff7-4a0c-84ec-4b60c63c707c-pod-info\") on node \"crc\" DevicePath 
\"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.483153 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-txsxt"] Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.486201 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f" (OuterVolumeSpecName: "persistence") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "pvc-22914768-1216-46d8-b41a-338cdc0e977f". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.502188 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-server-conf" (OuterVolumeSpecName: "server-conf") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: "f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.580924 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc011821-8710-499b-8547-4ab18c9e2592-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581261 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581284 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581304 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mt2h\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-kube-api-access-8mt2h\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581349 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581389 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581409 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581430 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581479 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581511 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc011821-8710-499b-8547-4ab18c9e2592-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581541 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581620 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") on node \"crc\" " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.581633 4914 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f394410a-5ff7-4a0c-84ec-4b60c63c707c-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.583407 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.584417 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.588583 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.589865 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/bc011821-8710-499b-8547-4ab18c9e2592-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.589991 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.591129 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc011821-8710-499b-8547-4ab18c9e2592-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.594120 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc011821-8710-499b-8547-4ab18c9e2592-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.594358 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.594393 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5be60c8bd10711c8c5de45ccf3a82e8fb293e62f568476a547ebf3cff93a9a23/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.599458 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.604003 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.607076 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mt2h\" (UniqueName: \"kubernetes.io/projected/bc011821-8710-499b-8547-4ab18c9e2592-kube-api-access-8mt2h\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.638683 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f394410a-5ff7-4a0c-84ec-4b60c63c707c" (UID: 
"f394410a-5ff7-4a0c-84ec-4b60c63c707c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.670583 4914 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.674900 4914 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-22914768-1216-46d8-b41a-338cdc0e977f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f") on node "crc" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.683445 4914 reconciler_common.go:293] "Volume detached for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.683473 4914 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f394410a-5ff7-4a0c-84ec-4b60c63c707c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.718748 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25113a91-49b5-491c-99f4-7569d427709f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25113a91-49b5-491c-99f4-7569d427709f\") pod \"rabbitmq-server-0\" (UID: \"bc011821-8710-499b-8547-4ab18c9e2592\") " pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.756222 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-klrhc" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.850858 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c506e0ae-e4b2-4cd7-87ea-bc10619f874e" path="/var/lib/kubelet/pods/c506e0ae-e4b2-4cd7-87ea-bc10619f874e/volumes" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.934724 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-certs\") pod \"097daca3-adbd-4d0a-8606-876822a0cd4a\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.934788 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-scripts\") pod \"097daca3-adbd-4d0a-8606-876822a0cd4a\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.934846 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-config-data\") pod \"097daca3-adbd-4d0a-8606-876822a0cd4a\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.934940 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrbmb\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-kube-api-access-nrbmb\") pod \"097daca3-adbd-4d0a-8606-876822a0cd4a\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.934998 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-combined-ca-bundle\") pod 
\"097daca3-adbd-4d0a-8606-876822a0cd4a\" (UID: \"097daca3-adbd-4d0a-8606-876822a0cd4a\") " Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.947771 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-scripts" (OuterVolumeSpecName: "scripts") pod "097daca3-adbd-4d0a-8606-876822a0cd4a" (UID: "097daca3-adbd-4d0a-8606-876822a0cd4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.949176 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-kube-api-access-nrbmb" (OuterVolumeSpecName: "kube-api-access-nrbmb") pod "097daca3-adbd-4d0a-8606-876822a0cd4a" (UID: "097daca3-adbd-4d0a-8606-876822a0cd4a"). InnerVolumeSpecName "kube-api-access-nrbmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.949818 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-certs" (OuterVolumeSpecName: "certs") pod "097daca3-adbd-4d0a-8606-876822a0cd4a" (UID: "097daca3-adbd-4d0a-8606-876822a0cd4a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.953129 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.984012 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-config-data" (OuterVolumeSpecName: "config-data") pod "097daca3-adbd-4d0a-8606-876822a0cd4a" (UID: "097daca3-adbd-4d0a-8606-876822a0cd4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:25 crc kubenswrapper[4914]: I0130 21:38:25.998886 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "097daca3-adbd-4d0a-8606-876822a0cd4a" (UID: "097daca3-adbd-4d0a-8606-876822a0cd4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.043593 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.043627 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.043640 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.043650 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097daca3-adbd-4d0a-8606-876822a0cd4a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.043661 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrbmb\" (UniqueName: \"kubernetes.io/projected/097daca3-adbd-4d0a-8606-876822a0cd4a-kube-api-access-nrbmb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.194622 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-klrhc" 
event={"ID":"097daca3-adbd-4d0a-8606-876822a0cd4a","Type":"ContainerDied","Data":"a067cbb10bd8825cf9e2f8e12892e4d6229e4c6593352cbe92c233e5c8b7d3db"} Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.194663 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a067cbb10bd8825cf9e2f8e12892e4d6229e4c6593352cbe92c233e5c8b7d3db" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.194624 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-klrhc" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.196215 4914 generic.go:334] "Generic (PLEG): container finished" podID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerID="630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a" exitCode=0 Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.196290 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.196389 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" event={"ID":"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9","Type":"ContainerDied","Data":"630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a"} Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.196435 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" event={"ID":"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9","Type":"ContainerStarted","Data":"043afc734258e55552240c547d49c63d2c2ef2fe142b5a8570be851cc8116de7"} Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.448156 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.478464 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.490536 
4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.499760 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: E0130 21:38:26.500389 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097daca3-adbd-4d0a-8606-876822a0cd4a" containerName="cloudkitty-storageinit" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.500456 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="097daca3-adbd-4d0a-8606-876822a0cd4a" containerName="cloudkitty-storageinit" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.500720 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="097daca3-adbd-4d0a-8606-876822a0cd4a" containerName="cloudkitty-storageinit" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.502510 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.511447 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.518077 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.525248 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.525451 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.525586 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.525762 4914 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-c4brf" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.525894 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.526242 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.562158 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.562391 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="d069f103-1546-4a76-963e-2d160d5a347d" containerName="cloudkitty-proc" containerID="cri-o://fdb7b1f8d5119dde9dc159739be80d70038da7d53ba436caed816511baa1b067" gracePeriod=30 Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.579899 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.580174 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api-log" containerID="cri-o://362ef2b0b58bcd117734262403641f928628a9463750d7f586b6a62c67828b49" gracePeriod=30 Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.580634 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api" containerID="cri-o://921990d8fa283366833ced4a3720567fbb50029acb68e25fcf13af86f5001b0f" gracePeriod=30 Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.668903 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669169 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669200 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669237 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669310 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669379 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669411 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669434 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7llx5\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-kube-api-access-7llx5\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669472 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/149d6b1c-dd4d-4433-906d-6774aeb77afb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669492 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.669540 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/149d6b1c-dd4d-4433-906d-6774aeb77afb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.770861 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/149d6b1c-dd4d-4433-906d-6774aeb77afb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771070 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771147 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771246 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771317 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771424 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771521 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771599 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771766 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7llx5\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-kube-api-access-7llx5\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771845 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/149d6b1c-dd4d-4433-906d-6774aeb77afb-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.771910 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.772264 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.772439 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.773580 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.773812 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/149d6b1c-dd4d-4433-906d-6774aeb77afb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc 
kubenswrapper[4914]: I0130 21:38:26.776523 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.783878 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.784377 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.784716 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/149d6b1c-dd4d-4433-906d-6774aeb77afb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.873441 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/149d6b1c-dd4d-4433-906d-6774aeb77afb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.875244 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7llx5\" (UniqueName: \"kubernetes.io/projected/149d6b1c-dd4d-4433-906d-6774aeb77afb-kube-api-access-7llx5\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.904480 4914 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.904532 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d876d4ec8ae95b00698cc5f4700898c35211c994a02910946b2404f2e7b6a3a2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.985825 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.985881 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:38:26 crc kubenswrapper[4914]: I0130 21:38:26.993883 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22914768-1216-46d8-b41a-338cdc0e977f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22914768-1216-46d8-b41a-338cdc0e977f\") pod \"rabbitmq-cell1-server-0\" (UID: \"149d6b1c-dd4d-4433-906d-6774aeb77afb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.151101 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.219626 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" event={"ID":"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9","Type":"ContainerStarted","Data":"33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2"} Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.219982 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.221515 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc011821-8710-499b-8547-4ab18c9e2592","Type":"ContainerStarted","Data":"1e6a3eb4caedb5ee0fd24728e8a0da9f1e5f017df16414061f85f764a82e0d18"} Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.227516 4914 generic.go:334] "Generic (PLEG): container finished" podID="fea8049a-f388-4d46-a567-473849787e27" containerID="362ef2b0b58bcd117734262403641f928628a9463750d7f586b6a62c67828b49" exitCode=143 Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.227586 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fea8049a-f388-4d46-a567-473849787e27","Type":"ContainerDied","Data":"362ef2b0b58bcd117734262403641f928628a9463750d7f586b6a62c67828b49"} Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.233086 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"49463c19-f32a-4288-9a5a-51d9c7b11e42","Type":"ContainerStarted","Data":"23e96833163100864a81d94dd171e7064bdc18dc47aaa47564745fddb1d51e0b"} Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.233383 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.242927 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" podStartSLOduration=3.242908424 podStartE2EDuration="3.242908424s" podCreationTimestamp="2026-01-30 21:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:27.24111733 +0000 UTC m=+1440.679754081" watchObservedRunningTime="2026-01-30 21:38:27.242908424 +0000 UTC m=+1440.681545185" Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.291676 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.722490388 podStartE2EDuration="13.291605264s" podCreationTimestamp="2026-01-30 21:38:14 +0000 UTC" firstStartedPulling="2026-01-30 21:38:14.974877583 +0000 UTC m=+1428.413514344" lastFinishedPulling="2026-01-30 21:38:26.543992459 +0000 UTC m=+1439.982629220" observedRunningTime="2026-01-30 21:38:27.284036689 +0000 UTC m=+1440.722673470" watchObservedRunningTime="2026-01-30 21:38:27.291605264 +0000 UTC m=+1440.730242025" Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.482561 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:38:27 crc kubenswrapper[4914]: I0130 21:38:27.842504 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f394410a-5ff7-4a0c-84ec-4b60c63c707c" path="/var/lib/kubelet/pods/f394410a-5ff7-4a0c-84ec-4b60c63c707c/volumes" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.248997 4914 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"149d6b1c-dd4d-4433-906d-6774aeb77afb","Type":"ContainerStarted","Data":"e8e4acc557432d1de84ecf27d80a8dbd656e446ec5f450750abdd71b228928fa"} Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.250894 4914 generic.go:334] "Generic (PLEG): container finished" podID="fea8049a-f388-4d46-a567-473849787e27" containerID="921990d8fa283366833ced4a3720567fbb50029acb68e25fcf13af86f5001b0f" exitCode=0 Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.250948 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fea8049a-f388-4d46-a567-473849787e27","Type":"ContainerDied","Data":"921990d8fa283366833ced4a3720567fbb50029acb68e25fcf13af86f5001b0f"} Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.260130 4914 generic.go:334] "Generic (PLEG): container finished" podID="d069f103-1546-4a76-963e-2d160d5a347d" containerID="fdb7b1f8d5119dde9dc159739be80d70038da7d53ba436caed816511baa1b067" exitCode=0 Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.260454 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"d069f103-1546-4a76-963e-2d160d5a347d","Type":"ContainerDied","Data":"fdb7b1f8d5119dde9dc159739be80d70038da7d53ba436caed816511baa1b067"} Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.411115 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.543522 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-certs\") pod \"d069f103-1546-4a76-963e-2d160d5a347d\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.543596 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data-custom\") pod \"d069f103-1546-4a76-963e-2d160d5a347d\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.543672 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-combined-ca-bundle\") pod \"d069f103-1546-4a76-963e-2d160d5a347d\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.543735 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7tsk\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-kube-api-access-v7tsk\") pod \"d069f103-1546-4a76-963e-2d160d5a347d\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.543787 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data\") pod \"d069f103-1546-4a76-963e-2d160d5a347d\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.543810 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-scripts\") pod \"d069f103-1546-4a76-963e-2d160d5a347d\" (UID: \"d069f103-1546-4a76-963e-2d160d5a347d\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.575381 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-certs" (OuterVolumeSpecName: "certs") pod "d069f103-1546-4a76-963e-2d160d5a347d" (UID: "d069f103-1546-4a76-963e-2d160d5a347d"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.575904 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-scripts" (OuterVolumeSpecName: "scripts") pod "d069f103-1546-4a76-963e-2d160d5a347d" (UID: "d069f103-1546-4a76-963e-2d160d5a347d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.579014 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d069f103-1546-4a76-963e-2d160d5a347d" (UID: "d069f103-1546-4a76-963e-2d160d5a347d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.645815 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.646048 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.646107 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.673240 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-kube-api-access-v7tsk" (OuterVolumeSpecName: "kube-api-access-v7tsk") pod "d069f103-1546-4a76-963e-2d160d5a347d" (UID: "d069f103-1546-4a76-963e-2d160d5a347d"). InnerVolumeSpecName "kube-api-access-v7tsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.712403 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data" (OuterVolumeSpecName: "config-data") pod "d069f103-1546-4a76-963e-2d160d5a347d" (UID: "d069f103-1546-4a76-963e-2d160d5a347d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.715034 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d069f103-1546-4a76-963e-2d160d5a347d" (UID: "d069f103-1546-4a76-963e-2d160d5a347d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.737264 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.748928 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.748994 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7tsk\" (UniqueName: \"kubernetes.io/projected/d069f103-1546-4a76-963e-2d160d5a347d-kube-api-access-v7tsk\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.749010 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069f103-1546-4a76-963e-2d160d5a347d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850094 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850187 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-internal-tls-certs\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850326 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fea8049a-f388-4d46-a567-473849787e27-logs\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850351 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-scripts\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850398 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtrpz\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-kube-api-access-gtrpz\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850428 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-combined-ca-bundle\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850504 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data-custom\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc 
kubenswrapper[4914]: I0130 21:38:28.850521 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-certs\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.850545 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-public-tls-certs\") pod \"fea8049a-f388-4d46-a567-473849787e27\" (UID: \"fea8049a-f388-4d46-a567-473849787e27\") " Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.851851 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea8049a-f388-4d46-a567-473849787e27-logs" (OuterVolumeSpecName: "logs") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.858750 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-scripts" (OuterVolumeSpecName: "scripts") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.862073 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-certs" (OuterVolumeSpecName: "certs") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.863210 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-kube-api-access-gtrpz" (OuterVolumeSpecName: "kube-api-access-gtrpz") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "kube-api-access-gtrpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.873850 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.897881 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data" (OuterVolumeSpecName: "config-data") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.903870 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.928942 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.947794 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fea8049a-f388-4d46-a567-473849787e27" (UID: "fea8049a-f388-4d46-a567-473849787e27"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952412 4914 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952444 4914 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952455 4914 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952463 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-config-data\") on node \"crc\" DevicePath \"\"" Jan 
30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952472 4914 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952480 4914 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fea8049a-f388-4d46-a567-473849787e27-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952487 4914 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952497 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtrpz\" (UniqueName: \"kubernetes.io/projected/fea8049a-f388-4d46-a567-473849787e27-kube-api-access-gtrpz\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:28 crc kubenswrapper[4914]: I0130 21:38:28.952507 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea8049a-f388-4d46-a567-473849787e27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.273501 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc011821-8710-499b-8547-4ab18c9e2592","Type":"ContainerStarted","Data":"2a82a0738717825be992f2277ab2e5c54714bbbd391ceaa7fe9b871587217620"} Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.275628 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"d069f103-1546-4a76-963e-2d160d5a347d","Type":"ContainerDied","Data":"aeb5d5d60a33865e48ce033f064f2ec0bbf42395dcefb15cbbd03c38ccf30e66"} Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.275669 4914 
scope.go:117] "RemoveContainer" containerID="fdb7b1f8d5119dde9dc159739be80d70038da7d53ba436caed816511baa1b067" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.275669 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.280836 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fea8049a-f388-4d46-a567-473849787e27","Type":"ContainerDied","Data":"aa8dfaf6cc5ed51e13e3e77713dfaf32dac431020f3e9210ddddb131e34802f2"} Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.280927 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.316342 4914 scope.go:117] "RemoveContainer" containerID="921990d8fa283366833ced4a3720567fbb50029acb68e25fcf13af86f5001b0f" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.340684 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.352233 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.367864 4914 scope.go:117] "RemoveContainer" containerID="362ef2b0b58bcd117734262403641f928628a9463750d7f586b6a62c67828b49" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.375869 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.386820 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.396921 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: E0130 21:38:29.397416 4914 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.397439 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api" Jan 30 21:38:29 crc kubenswrapper[4914]: E0130 21:38:29.397466 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d069f103-1546-4a76-963e-2d160d5a347d" containerName="cloudkitty-proc" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.397474 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d069f103-1546-4a76-963e-2d160d5a347d" containerName="cloudkitty-proc" Jan 30 21:38:29 crc kubenswrapper[4914]: E0130 21:38:29.397511 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api-log" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.397519 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api-log" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.397804 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d069f103-1546-4a76-963e-2d160d5a347d" containerName="cloudkitty-proc" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.397827 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.397853 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea8049a-f388-4d46-a567-473849787e27" containerName="cloudkitty-api-log" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.399248 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.405682 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406068 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406186 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406335 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406470 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406613 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406869 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-tfj5s" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.406998 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.408744 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.412473 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.415836 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.428373 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.575792 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb76217-e54d-437c-91a9-170a095719ee-logs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.575872 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-scripts\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.575987 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-scripts\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576032 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mrp9\" (UniqueName: \"kubernetes.io/projected/022f86fb-3379-48ae-8987-348212c3e28e-kube-api-access-5mrp9\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") 
" pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576216 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576241 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd98w\" (UniqueName: \"kubernetes.io/projected/3fb76217-e54d-437c-91a9-170a095719ee-kube-api-access-sd98w\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576279 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576340 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576370 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 
21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576416 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576452 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-config-data\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576536 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/022f86fb-3379-48ae-8987-348212c3e28e-certs\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576602 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-config-data\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576661 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3fb76217-e54d-437c-91a9-170a095719ee-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.576692 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.678924 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mrp9\" (UniqueName: \"kubernetes.io/projected/022f86fb-3379-48ae-8987-348212c3e28e-kube-api-access-5mrp9\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679033 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679057 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd98w\" (UniqueName: \"kubernetes.io/projected/3fb76217-e54d-437c-91a9-170a095719ee-kube-api-access-sd98w\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679137 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679192 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679213 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679248 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679270 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-config-data\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679309 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/022f86fb-3379-48ae-8987-348212c3e28e-certs\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679352 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-config-data\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 
30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679388 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3fb76217-e54d-437c-91a9-170a095719ee-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679411 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679438 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb76217-e54d-437c-91a9-170a095719ee-logs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679475 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-scripts\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.679541 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-scripts\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.682070 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb76217-e54d-437c-91a9-170a095719ee-logs\") pod 
\"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.698429 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.702403 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/022f86fb-3379-48ae-8987-348212c3e28e-certs\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.702936 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.703058 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-config-data\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.703411 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.706936 4914 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-scripts\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.712010 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.716170 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-scripts\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.717053 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/022f86fb-3379-48ae-8987-348212c3e28e-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.719359 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.730357 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3fb76217-e54d-437c-91a9-170a095719ee-certs\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 
30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.738078 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb76217-e54d-437c-91a9-170a095719ee-config-data\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.743269 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd98w\" (UniqueName: \"kubernetes.io/projected/3fb76217-e54d-437c-91a9-170a095719ee-kube-api-access-sd98w\") pod \"cloudkitty-api-0\" (UID: \"3fb76217-e54d-437c-91a9-170a095719ee\") " pod="openstack/cloudkitty-api-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.749260 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mrp9\" (UniqueName: \"kubernetes.io/projected/022f86fb-3379-48ae-8987-348212c3e28e-kube-api-access-5mrp9\") pod \"cloudkitty-proc-0\" (UID: \"022f86fb-3379-48ae-8987-348212c3e28e\") " pod="openstack/cloudkitty-proc-0" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.829920 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d069f103-1546-4a76-963e-2d160d5a347d" path="/var/lib/kubelet/pods/d069f103-1546-4a76-963e-2d160d5a347d/volumes" Jan 30 21:38:29 crc kubenswrapper[4914]: I0130 21:38:29.830482 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea8049a-f388-4d46-a567-473849787e27" path="/var/lib/kubelet/pods/fea8049a-f388-4d46-a567-473849787e27/volumes" Jan 30 21:38:30 crc kubenswrapper[4914]: I0130 21:38:30.027680 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 30 21:38:30 crc kubenswrapper[4914]: I0130 21:38:30.051168 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 30 21:38:30 crc kubenswrapper[4914]: I0130 21:38:30.349158 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"149d6b1c-dd4d-4433-906d-6774aeb77afb","Type":"ContainerStarted","Data":"d4c95fe6e7fbc44bc247c286fe5787cf0026bb9acdd24f614513d28ead0ddc4a"} Jan 30 21:38:30 crc kubenswrapper[4914]: I0130 21:38:30.603656 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 30 21:38:30 crc kubenswrapper[4914]: I0130 21:38:30.691060 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 30 21:38:30 crc kubenswrapper[4914]: W0130 21:38:30.693852 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod022f86fb_3379_48ae_8987_348212c3e28e.slice/crio-0435b15aa10c95a51bbd5629874fb4c0cc1fd1b8e79c7bd3ae9f3f6ae3f706a7 WatchSource:0}: Error finding container 0435b15aa10c95a51bbd5629874fb4c0cc1fd1b8e79c7bd3ae9f3f6ae3f706a7: Status 404 returned error can't find the container with id 0435b15aa10c95a51bbd5629874fb4c0cc1fd1b8e79c7bd3ae9f3f6ae3f706a7 Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.362944 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"3fb76217-e54d-437c-91a9-170a095719ee","Type":"ContainerStarted","Data":"80814dbe244af39e8153d5bf2981c3d2d4c66caf2cbd431e2e15e0a5da876c89"} Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.363322 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.363339 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"3fb76217-e54d-437c-91a9-170a095719ee","Type":"ContainerStarted","Data":"d191ba876a27416543c8bcc0eb6422b273d1a3c80cb940d36841e683331fe116"} Jan 30 
21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.363370 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"3fb76217-e54d-437c-91a9-170a095719ee","Type":"ContainerStarted","Data":"d4395713c856e71b9f31c04105caa087be63744dd397eae62da5778b17553129"} Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.365652 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"022f86fb-3379-48ae-8987-348212c3e28e","Type":"ContainerStarted","Data":"a1b0b9783117de4c5f58c5b5059ac674e5cb16a9ac10d29256234e10ccbd24fa"} Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.365694 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"022f86fb-3379-48ae-8987-348212c3e28e","Type":"ContainerStarted","Data":"0435b15aa10c95a51bbd5629874fb4c0cc1fd1b8e79c7bd3ae9f3f6ae3f706a7"} Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.398100 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.398079479 podStartE2EDuration="2.398079479s" podCreationTimestamp="2026-01-30 21:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:31.382116999 +0000 UTC m=+1444.820753760" watchObservedRunningTime="2026-01-30 21:38:31.398079479 +0000 UTC m=+1444.836716240" Jan 30 21:38:31 crc kubenswrapper[4914]: I0130 21:38:31.398321 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=< Jan 30 21:38:31 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:38:31 crc kubenswrapper[4914]: > Jan 30 21:38:34 crc kubenswrapper[4914]: I0130 21:38:34.799893 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" Jan 30 21:38:34 crc kubenswrapper[4914]: I0130 21:38:34.826730 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=5.5474474879999995 podStartE2EDuration="5.826687944s" podCreationTimestamp="2026-01-30 21:38:29 +0000 UTC" firstStartedPulling="2026-01-30 21:38:30.696166301 +0000 UTC m=+1444.134803062" lastFinishedPulling="2026-01-30 21:38:30.975406757 +0000 UTC m=+1444.414043518" observedRunningTime="2026-01-30 21:38:31.411115278 +0000 UTC m=+1444.849752039" watchObservedRunningTime="2026-01-30 21:38:34.826687944 +0000 UTC m=+1448.265324705" Jan 30 21:38:34 crc kubenswrapper[4914]: I0130 21:38:34.861441 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-5c58f"] Jan 30 21:38:34 crc kubenswrapper[4914]: I0130 21:38:34.861740 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" containerName="dnsmasq-dns" containerID="cri-o://7d55f367f6c5b31b760a00857883921d36ad4779e2db6e4ff3c4418689116835" gracePeriod=10 Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.055134 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-h8dtr"] Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.057990 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.081203 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-h8dtr"] Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218224 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218482 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-dns-svc\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218514 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-config\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218536 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218579 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bmknk\" (UniqueName: \"kubernetes.io/projected/6de60584-391f-413b-b341-8abcd770eb7d-kube-api-access-bmknk\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218600 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.218661 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320117 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320178 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-dns-svc\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320209 4914 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-config\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320233 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320276 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmknk\" (UniqueName: \"kubernetes.io/projected/6de60584-391f-413b-b341-8abcd770eb7d-kube-api-access-bmknk\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320296 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.320358 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.321200 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.321766 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.321780 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-config\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.322305 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.322534 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.322825 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de60584-391f-413b-b341-8abcd770eb7d-dns-svc\") pod 
\"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.345103 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmknk\" (UniqueName: \"kubernetes.io/projected/6de60584-391f-413b-b341-8abcd770eb7d-kube-api-access-bmknk\") pod \"dnsmasq-dns-85f64749dc-h8dtr\" (UID: \"6de60584-391f-413b-b341-8abcd770eb7d\") " pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.405719 4914 generic.go:334] "Generic (PLEG): container finished" podID="b4b59e65-251e-4633-8961-af93c0b108ce" containerID="7d55f367f6c5b31b760a00857883921d36ad4779e2db6e4ff3c4418689116835" exitCode=0 Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.405762 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" event={"ID":"b4b59e65-251e-4633-8961-af93c0b108ce","Type":"ContainerDied","Data":"7d55f367f6c5b31b760a00857883921d36ad4779e2db6e4ff3c4418689116835"} Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.418569 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.592374 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.731987 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-svc\") pod \"b4b59e65-251e-4633-8961-af93c0b108ce\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.732138 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-sb\") pod \"b4b59e65-251e-4633-8961-af93c0b108ce\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.732475 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-swift-storage-0\") pod \"b4b59e65-251e-4633-8961-af93c0b108ce\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.732561 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-config\") pod \"b4b59e65-251e-4633-8961-af93c0b108ce\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.732634 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghq2q\" (UniqueName: \"kubernetes.io/projected/b4b59e65-251e-4633-8961-af93c0b108ce-kube-api-access-ghq2q\") pod \"b4b59e65-251e-4633-8961-af93c0b108ce\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.732682 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-nb\") pod \"b4b59e65-251e-4633-8961-af93c0b108ce\" (UID: \"b4b59e65-251e-4633-8961-af93c0b108ce\") " Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.741569 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b59e65-251e-4633-8961-af93c0b108ce-kube-api-access-ghq2q" (OuterVolumeSpecName: "kube-api-access-ghq2q") pod "b4b59e65-251e-4633-8961-af93c0b108ce" (UID: "b4b59e65-251e-4633-8961-af93c0b108ce"). InnerVolumeSpecName "kube-api-access-ghq2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.797531 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4b59e65-251e-4633-8961-af93c0b108ce" (UID: "b4b59e65-251e-4633-8961-af93c0b108ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.835392 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghq2q\" (UniqueName: \"kubernetes.io/projected/b4b59e65-251e-4633-8961-af93c0b108ce-kube-api-access-ghq2q\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.835415 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.845808 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4b59e65-251e-4633-8961-af93c0b108ce" (UID: "b4b59e65-251e-4633-8961-af93c0b108ce"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.840961 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-config" (OuterVolumeSpecName: "config") pod "b4b59e65-251e-4633-8961-af93c0b108ce" (UID: "b4b59e65-251e-4633-8961-af93c0b108ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.866493 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4b59e65-251e-4633-8961-af93c0b108ce" (UID: "b4b59e65-251e-4633-8961-af93c0b108ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.879181 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4b59e65-251e-4633-8961-af93c0b108ce" (UID: "b4b59e65-251e-4633-8961-af93c0b108ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.937470 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.937504 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.937514 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.937524 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4b59e65-251e-4633-8961-af93c0b108ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:35 crc kubenswrapper[4914]: I0130 21:38:35.964371 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-h8dtr"]
Jan 30 21:38:35 crc kubenswrapper[4914]: W0130 21:38:35.969960 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de60584_391f_413b_b341_8abcd770eb7d.slice/crio-a54f86bd83241edbcb6ad98505423070ee115cfaf498118e3291b93607f62bbf WatchSource:0}: Error finding container a54f86bd83241edbcb6ad98505423070ee115cfaf498118e3291b93607f62bbf: Status 404 returned error can't find the container with id a54f86bd83241edbcb6ad98505423070ee115cfaf498118e3291b93607f62bbf
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.431257 4914 generic.go:334] "Generic (PLEG): container finished" podID="6de60584-391f-413b-b341-8abcd770eb7d" containerID="8e66e9213323945f3687be33aeaab584fdd386705fdd0af2e1f285865370e8ba" exitCode=0
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.431343 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" event={"ID":"6de60584-391f-413b-b341-8abcd770eb7d","Type":"ContainerDied","Data":"8e66e9213323945f3687be33aeaab584fdd386705fdd0af2e1f285865370e8ba"}
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.431375 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" event={"ID":"6de60584-391f-413b-b341-8abcd770eb7d","Type":"ContainerStarted","Data":"a54f86bd83241edbcb6ad98505423070ee115cfaf498118e3291b93607f62bbf"}
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.448663 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f" event={"ID":"b4b59e65-251e-4633-8961-af93c0b108ce","Type":"ContainerDied","Data":"1706a513dfd35bac292c5642bc7fffbc18d602dc4687a951766358d23dc143c6"}
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.449200 4914 scope.go:117] "RemoveContainer" containerID="7d55f367f6c5b31b760a00857883921d36ad4779e2db6e4ff3c4418689116835"
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.449168 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-5c58f"
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.655101 4914 scope.go:117] "RemoveContainer" containerID="0e309fb1c61c59b3d28900dd5b24929b5f96e0bdce1a951b9d3c8868383c76ba"
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.655952 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-5c58f"]
Jan 30 21:38:36 crc kubenswrapper[4914]: I0130 21:38:36.666593 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-5c58f"]
Jan 30 21:38:37 crc kubenswrapper[4914]: I0130 21:38:37.460472 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" event={"ID":"6de60584-391f-413b-b341-8abcd770eb7d","Type":"ContainerStarted","Data":"401e8ac870c1060859a62ea9642b90f0d496361b705ff99e74233b553d0caf43"}
Jan 30 21:38:37 crc kubenswrapper[4914]: I0130 21:38:37.460659 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr"
Jan 30 21:38:37 crc kubenswrapper[4914]: I0130 21:38:37.483520 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr" podStartSLOduration=2.483502313 podStartE2EDuration="2.483502313s" podCreationTimestamp="2026-01-30 21:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:37.482109069 +0000 UTC m=+1450.920745860" watchObservedRunningTime="2026-01-30 21:38:37.483502313 +0000 UTC m=+1450.922139074"
Jan 30 21:38:37 crc kubenswrapper[4914]: I0130 21:38:37.831388 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" path="/var/lib/kubelet/pods/b4b59e65-251e-4633-8961-af93c0b108ce/volumes"
Jan 30 21:38:40 crc kubenswrapper[4914]: I0130 21:38:40.804002 4914 scope.go:117] "RemoveContainer" containerID="004c6c908d0d5695cba3e148480eb2debb05fb64209a0e7961d73f9232c504b0"
Jan 30 21:38:40 crc kubenswrapper[4914]: I0130 21:38:40.835205 4914 scope.go:117] "RemoveContainer" containerID="f8343c308380c5164c5ade6d747612b6694879f8d31de8cbd0fbfc60d77d1c07"
Jan 30 21:38:41 crc kubenswrapper[4914]: I0130 21:38:41.368823 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:38:41 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:38:41 crc kubenswrapper[4914]: >
Jan 30 21:38:44 crc kubenswrapper[4914]: I0130 21:38:44.399645 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 30 21:38:45 crc kubenswrapper[4914]: I0130 21:38:45.420674 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-h8dtr"
Jan 30 21:38:45 crc kubenswrapper[4914]: I0130 21:38:45.541285 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-txsxt"]
Jan 30 21:38:45 crc kubenswrapper[4914]: I0130 21:38:45.541999 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerName="dnsmasq-dns" containerID="cri-o://33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2" gracePeriod=10
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.120820 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319365 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-config\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319434 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-swift-storage-0\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319457 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-svc\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319560 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmk5r\" (UniqueName: \"kubernetes.io/projected/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-kube-api-access-jmk5r\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319609 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-nb\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319661 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-sb\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.319717 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-openstack-edpm-ipam\") pod \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\" (UID: \"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9\") "
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.354016 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-kube-api-access-jmk5r" (OuterVolumeSpecName: "kube-api-access-jmk5r") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "kube-api-access-jmk5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.399670 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.407343 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.410830 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-config" (OuterVolumeSpecName: "config") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.422514 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmk5r\" (UniqueName: \"kubernetes.io/projected/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-kube-api-access-jmk5r\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.422541 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.422550 4914 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.422558 4914 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.427779 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.434788 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.442146 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" (UID: "bb375cda-19fc-4e4c-ba96-cd120b0e7ca9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.524403 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.524449 4914 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.524462 4914 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.571470 4914 generic.go:334] "Generic (PLEG): container finished" podID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerID="33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2" exitCode=0
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.571509 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" event={"ID":"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9","Type":"ContainerDied","Data":"33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2"}
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.571557 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt" event={"ID":"bb375cda-19fc-4e4c-ba96-cd120b0e7ca9","Type":"ContainerDied","Data":"043afc734258e55552240c547d49c63d2c2ef2fe142b5a8570be851cc8116de7"}
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.571569 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-txsxt"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.571579 4914 scope.go:117] "RemoveContainer" containerID="33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.606869 4914 scope.go:117] "RemoveContainer" containerID="630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.614643 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-txsxt"]
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.623497 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-txsxt"]
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.649999 4914 scope.go:117] "RemoveContainer" containerID="33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2"
Jan 30 21:38:46 crc kubenswrapper[4914]: E0130 21:38:46.650424 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2\": container with ID starting with 33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2 not found: ID does not exist" containerID="33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.650462 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2"} err="failed to get container status \"33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2\": rpc error: code = NotFound desc = could not find container \"33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2\": container with ID starting with 33adb07fecf4ddc8cce29fb617054e8458bfc4ba45828309b6b30818557d38c2 not found: ID does not exist"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.650488 4914 scope.go:117] "RemoveContainer" containerID="630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a"
Jan 30 21:38:46 crc kubenswrapper[4914]: E0130 21:38:46.650734 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a\": container with ID starting with 630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a not found: ID does not exist" containerID="630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a"
Jan 30 21:38:46 crc kubenswrapper[4914]: I0130 21:38:46.650762 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a"} err="failed to get container status \"630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a\": rpc error: code = NotFound desc = could not find container \"630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a\": container with ID starting with 630d5d5eba8ac91f024a0e6350e773cf0cf27678782095d591fb8585f57c842a not found: ID does not exist"
Jan 30 21:38:47 crc kubenswrapper[4914]: I0130 21:38:47.828161 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" path="/var/lib/kubelet/pods/bb375cda-19fc-4e4c-ba96-cd120b0e7ca9/volumes"
Jan 30 21:38:51 crc kubenswrapper[4914]: I0130 21:38:51.401828 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:38:51 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:38:51 crc kubenswrapper[4914]: >
Jan 30 21:38:56 crc kubenswrapper[4914]: I0130 21:38:56.982737 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:38:56 crc kubenswrapper[4914]: I0130 21:38:56.983461 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:38:56 crc kubenswrapper[4914]: I0130 21:38:56.983503 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 21:38:56 crc kubenswrapper[4914]: I0130 21:38:56.984273 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"018dff8f009112f2d13f034fc24ae6b87f418ea17a0bfaeb82d8fef0d185a5d1"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 21:38:56 crc kubenswrapper[4914]: I0130 21:38:56.984323 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://018dff8f009112f2d13f034fc24ae6b87f418ea17a0bfaeb82d8fef0d185a5d1" gracePeriod=600
Jan 30 21:38:57 crc kubenswrapper[4914]: I0130 21:38:57.687451 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="018dff8f009112f2d13f034fc24ae6b87f418ea17a0bfaeb82d8fef0d185a5d1" exitCode=0
Jan 30 21:38:57 crc kubenswrapper[4914]: I0130 21:38:57.687528 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"018dff8f009112f2d13f034fc24ae6b87f418ea17a0bfaeb82d8fef0d185a5d1"}
Jan 30 21:38:57 crc kubenswrapper[4914]: I0130 21:38:57.688319 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"}
Jan 30 21:38:57 crc kubenswrapper[4914]: I0130 21:38:57.688397 4914 scope.go:117] "RemoveContainer" containerID="f0fa301f4a7d6f2d2094968ff039d7aedbb13e612ee90301cf0076f1904de139"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.360603 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"]
Jan 30 21:38:58 crc kubenswrapper[4914]: E0130 21:38:58.361973 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerName="init"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.361992 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerName="init"
Jan 30 21:38:58 crc kubenswrapper[4914]: E0130 21:38:58.362012 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" containerName="dnsmasq-dns"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.362019 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" containerName="dnsmasq-dns"
Jan 30 21:38:58 crc kubenswrapper[4914]: E0130 21:38:58.362043 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerName="dnsmasq-dns"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.362051 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerName="dnsmasq-dns"
Jan 30 21:38:58 crc kubenswrapper[4914]: E0130 21:38:58.362090 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" containerName="init"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.362102 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" containerName="init"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.362355 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb375cda-19fc-4e4c-ba96-cd120b0e7ca9" containerName="dnsmasq-dns"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.362374 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b59e65-251e-4633-8961-af93c0b108ce" containerName="dnsmasq-dns"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.363537 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.367438 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.368361 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.368945 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.369030 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.372392 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"]
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.508531 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.508602 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t44t\" (UniqueName: \"kubernetes.io/projected/59c442fc-77b4-430b-8522-86705c6f7d3c-kube-api-access-5t44t\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.508883 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.509248 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.611663 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.611725 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t44t\" (UniqueName: \"kubernetes.io/projected/59c442fc-77b4-430b-8522-86705c6f7d3c-kube-api-access-5t44t\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.611801 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.611856 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.617907 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.617992 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.620157 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.634645 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t44t\" (UniqueName: \"kubernetes.io/projected/59c442fc-77b4-430b-8522-86705c6f7d3c-kube-api-access-5t44t\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:58 crc kubenswrapper[4914]: I0130 21:38:58.685007 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"
Jan 30 21:38:59 crc kubenswrapper[4914]: W0130 21:38:59.261550 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59c442fc_77b4_430b_8522_86705c6f7d3c.slice/crio-59f72ba8451fd2c8b516713768ef25eb75355b45f0662edad81adb25e8465594 WatchSource:0}: Error finding container 59f72ba8451fd2c8b516713768ef25eb75355b45f0662edad81adb25e8465594: Status 404 returned error can't find the container with id 59f72ba8451fd2c8b516713768ef25eb75355b45f0662edad81adb25e8465594
Jan 30 21:38:59 crc kubenswrapper[4914]: I0130 21:38:59.267670 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj"]
Jan 30 21:38:59 crc kubenswrapper[4914]: I0130 21:38:59.712761 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" event={"ID":"59c442fc-77b4-430b-8522-86705c6f7d3c","Type":"ContainerStarted","Data":"59f72ba8451fd2c8b516713768ef25eb75355b45f0662edad81adb25e8465594"}
Jan 30 21:39:00 crc kubenswrapper[4914]: I0130 21:39:00.731234 4914 generic.go:334] "Generic (PLEG): container finished" podID="bc011821-8710-499b-8547-4ab18c9e2592" containerID="2a82a0738717825be992f2277ab2e5c54714bbbd391ceaa7fe9b871587217620" exitCode=0
Jan 30 21:39:00 crc kubenswrapper[4914]: I0130 21:39:00.731344 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc011821-8710-499b-8547-4ab18c9e2592","Type":"ContainerDied","Data":"2a82a0738717825be992f2277ab2e5c54714bbbd391ceaa7fe9b871587217620"}
Jan 30 21:39:01 crc kubenswrapper[4914]: I0130 21:39:01.377597 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:39:01 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:39:01 crc kubenswrapper[4914]: >
Jan 30 21:39:01 crc kubenswrapper[4914]: I0130 21:39:01.742179 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc011821-8710-499b-8547-4ab18c9e2592","Type":"ContainerStarted","Data":"fa075fd990d29fccffd47119e05bd3f240769da5064ef2a95bf8d1844c287522"}
Jan 30 21:39:01 crc kubenswrapper[4914]: I0130 21:39:01.742366 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 30 21:39:01 crc kubenswrapper[4914]: I0130 21:39:01.743567 4914 generic.go:334] "Generic (PLEG): container finished" podID="149d6b1c-dd4d-4433-906d-6774aeb77afb" containerID="d4c95fe6e7fbc44bc247c286fe5787cf0026bb9acdd24f614513d28ead0ddc4a" exitCode=0
Jan 30 21:39:01 crc kubenswrapper[4914]: I0130 21:39:01.743605 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"149d6b1c-dd4d-4433-906d-6774aeb77afb","Type":"ContainerDied","Data":"d4c95fe6e7fbc44bc247c286fe5787cf0026bb9acdd24f614513d28ead0ddc4a"}
Jan 30 21:39:01 crc kubenswrapper[4914]: I0130 21:39:01.775345 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.775324234 podStartE2EDuration="36.775324234s" podCreationTimestamp="2026-01-30 21:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:01.766622261 +0000 UTC m=+1475.205259012" watchObservedRunningTime="2026-01-30 21:39:01.775324234 +0000 UTC m=+1475.213961025"
Jan 30 21:39:02 crc kubenswrapper[4914]: I0130 21:39:02.775833 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"149d6b1c-dd4d-4433-906d-6774aeb77afb","Type":"ContainerStarted","Data":"1fa5ac3855d62b18296c10cdf8deb2eb38febb1fb5fa38b5bd98e54facd9e1ed"}
Jan 30 21:39:02 crc kubenswrapper[4914]: I0130 21:39:02.778291 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:39:02 crc kubenswrapper[4914]: I0130 21:39:02.806620 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.806597854 podStartE2EDuration="36.806597854s" podCreationTimestamp="2026-01-30 21:38:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:02.80151544 +0000 UTC m=+1476.240152201" watchObservedRunningTime="2026-01-30 21:39:02.806597854 +0000 UTC m=+1476.245234615"
Jan 30 21:39:10 crc kubenswrapper[4914]: I0130 21:39:10.044924 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="3fb76217-e54d-437c-91a9-170a095719ee" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.238:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:39:10 crc kubenswrapper[4914]: I0130 21:39:10.044945 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="3fb76217-e54d-437c-91a9-170a095719ee" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.238:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:39:11 crc kubenswrapper[4914]: I0130 21:39:11.378085 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=<
Jan 30 21:39:11 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 21:39:11 crc kubenswrapper[4914]: >
Jan 30 21:39:15 crc kubenswrapper[4914]: I0130 21:39:15.054912 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="3fb76217-e54d-437c-91a9-170a095719ee" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.238:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:39:15 crc kubenswrapper[4914]: I0130 21:39:15.054945 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="3fb76217-e54d-437c-91a9-170a095719ee" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.238:8889/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:39:15 crc kubenswrapper[4914]: E0130 21:39:15.887382 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest"
Jan 30 21:39:15 crc kubenswrapper[4914]: E0130 21:39:15.887594 4914 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Jan 30 21:39:15 crc kubenswrapper[4914]: container
&Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Jan 30 21:39:15 crc kubenswrapper[4914]: - hosts: all Jan 30 21:39:15 crc kubenswrapper[4914]: strategy: linear Jan 30 21:39:15 crc kubenswrapper[4914]: tasks: Jan 30 21:39:15 crc kubenswrapper[4914]: - name: Enable podified-repos Jan 30 21:39:15 crc kubenswrapper[4914]: become: true Jan 30 21:39:15 crc kubenswrapper[4914]: ansible.builtin.shell: | Jan 30 21:39:15 crc kubenswrapper[4914]: set -euxo pipefail Jan 30 21:39:15 crc kubenswrapper[4914]: pushd /var/tmp Jan 30 21:39:15 crc kubenswrapper[4914]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Jan 30 21:39:15 crc kubenswrapper[4914]: pushd repo-setup-main Jan 30 21:39:15 crc kubenswrapper[4914]: python3 -m venv ./venv Jan 30 21:39:15 crc kubenswrapper[4914]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Jan 30 21:39:15 crc kubenswrapper[4914]: ./venv/bin/repo-setup current-podified -b antelope Jan 30 21:39:15 crc kubenswrapper[4914]: popd Jan 30 21:39:15 crc kubenswrapper[4914]: rm -rf repo-setup-main Jan 30 21:39:15 crc kubenswrapper[4914]: Jan 30 21:39:15 crc kubenswrapper[4914]: Jan 30 21:39:15 crc kubenswrapper[4914]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Jan 30 21:39:15 crc kubenswrapper[4914]: edpm_override_hosts: openstack-edpm-ipam Jan 30 21:39:15 crc kubenswrapper[4914]: edpm_service_type: repo-setup Jan 30 21:39:15 crc kubenswrapper[4914]: Jan 30 21:39:15 crc kubenswrapper[4914]: Jan 30 21:39:15 crc kubenswrapper[4914]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t44t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj_openstack(59c442fc-77b4-430b-8522-86705c6f7d3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Jan 30 21:39:15 crc kubenswrapper[4914]: > logger="UnhandledError" Jan 30 21:39:15 crc kubenswrapper[4914]: E0130 21:39:15.889080 4914 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" podUID="59c442fc-77b4-430b-8522-86705c6f7d3c" Jan 30 21:39:15 crc kubenswrapper[4914]: E0130 21:39:15.930150 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" podUID="59c442fc-77b4-430b-8522-86705c6f7d3c" Jan 30 21:39:15 crc kubenswrapper[4914]: I0130 21:39:15.955319 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bc011821-8710-499b-8547-4ab18c9e2592" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.236:5671: connect: connection refused" Jan 30 21:39:16 crc kubenswrapper[4914]: I0130 21:39:16.192594 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Jan 30 21:39:17 crc kubenswrapper[4914]: I0130 21:39:17.153122 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="149d6b1c-dd4d-4433-906d-6774aeb77afb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5671: connect: connection refused" Jan 30 21:39:21 crc kubenswrapper[4914]: I0130 21:39:21.379309 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=< Jan 30 21:39:21 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:39:21 crc 
kubenswrapper[4914]: > Jan 30 21:39:25 crc kubenswrapper[4914]: I0130 21:39:25.957146 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 21:39:27 crc kubenswrapper[4914]: I0130 21:39:27.152918 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:39:29 crc kubenswrapper[4914]: I0130 21:39:29.567026 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:39:30 crc kubenswrapper[4914]: I0130 21:39:30.343664 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" event={"ID":"59c442fc-77b4-430b-8522-86705c6f7d3c","Type":"ContainerStarted","Data":"6a7ae913839aa0a88b0da868b66af83325d763e0b18b9c7ff0cfdd7357c0510d"} Jan 30 21:39:30 crc kubenswrapper[4914]: I0130 21:39:30.378646 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" podStartSLOduration=2.07807513 podStartE2EDuration="32.378619903s" podCreationTimestamp="2026-01-30 21:38:58 +0000 UTC" firstStartedPulling="2026-01-30 21:38:59.264218118 +0000 UTC m=+1472.702854889" lastFinishedPulling="2026-01-30 21:39:29.564762901 +0000 UTC m=+1503.003399662" observedRunningTime="2026-01-30 21:39:30.373296853 +0000 UTC m=+1503.811933614" watchObservedRunningTime="2026-01-30 21:39:30.378619903 +0000 UTC m=+1503.817256664" Jan 30 21:39:31 crc kubenswrapper[4914]: I0130 21:39:31.398654 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" probeResult="failure" output=< Jan 30 21:39:31 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 21:39:31 crc kubenswrapper[4914]: > Jan 30 21:39:40 crc kubenswrapper[4914]: I0130 
21:39:40.389902 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:39:40 crc kubenswrapper[4914]: I0130 21:39:40.493558 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:39:40 crc kubenswrapper[4914]: I0130 21:39:40.637205 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7kk2g"] Jan 30 21:39:40 crc kubenswrapper[4914]: I0130 21:39:40.986009 4914 scope.go:117] "RemoveContainer" containerID="70cbbbbafe8cea99c64866db7ecbc122e676e7f4c10a9cddda720373900c0b08" Jan 30 21:39:41 crc kubenswrapper[4914]: I0130 21:39:41.020614 4914 scope.go:117] "RemoveContainer" containerID="ddd6ea585279fb45564d13808e7fcb6502f0356ac4893d7eb01fcf04b4dda22a" Jan 30 21:39:41 crc kubenswrapper[4914]: I0130 21:39:41.193404 4914 scope.go:117] "RemoveContainer" containerID="81433ff0f628af87671a18cfd72c656192e9c4f0ecad8d43815099cf1bfc51c1" Jan 30 21:39:41 crc kubenswrapper[4914]: I0130 21:39:41.237752 4914 scope.go:117] "RemoveContainer" containerID="272968fda7cc662786817a7a393a09260918e5cf25029cbdea32b8853cb52967" Jan 30 21:39:41 crc kubenswrapper[4914]: I0130 21:39:41.448993 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7kk2g" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" containerID="cri-o://1efcced53d4c59f33eecabc32c02a82a327a73342e8b369fa2aa989e0130df29" gracePeriod=2 Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.461554 4914 generic.go:334] "Generic (PLEG): container finished" podID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerID="1efcced53d4c59f33eecabc32c02a82a327a73342e8b369fa2aa989e0130df29" exitCode=0 Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.461804 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerDied","Data":"1efcced53d4c59f33eecabc32c02a82a327a73342e8b369fa2aa989e0130df29"} Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.706097 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.819625 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-utilities\") pod \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.819740 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-catalog-content\") pod \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.820007 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzcbr\" (UniqueName: \"kubernetes.io/projected/ae495b2b-b99b-4051-bd64-c54667d4d9bc-kube-api-access-lzcbr\") pod \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\" (UID: \"ae495b2b-b99b-4051-bd64-c54667d4d9bc\") " Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.820984 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-utilities" (OuterVolumeSpecName: "utilities") pod "ae495b2b-b99b-4051-bd64-c54667d4d9bc" (UID: "ae495b2b-b99b-4051-bd64-c54667d4d9bc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.824351 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.842951 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae495b2b-b99b-4051-bd64-c54667d4d9bc-kube-api-access-lzcbr" (OuterVolumeSpecName: "kube-api-access-lzcbr") pod "ae495b2b-b99b-4051-bd64-c54667d4d9bc" (UID: "ae495b2b-b99b-4051-bd64-c54667d4d9bc"). InnerVolumeSpecName "kube-api-access-lzcbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.927329 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzcbr\" (UniqueName: \"kubernetes.io/projected/ae495b2b-b99b-4051-bd64-c54667d4d9bc-kube-api-access-lzcbr\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:42 crc kubenswrapper[4914]: I0130 21:39:42.944202 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae495b2b-b99b-4051-bd64-c54667d4d9bc" (UID: "ae495b2b-b99b-4051-bd64-c54667d4d9bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.030167 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae495b2b-b99b-4051-bd64-c54667d4d9bc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.486434 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kk2g" event={"ID":"ae495b2b-b99b-4051-bd64-c54667d4d9bc","Type":"ContainerDied","Data":"ecb965cdf9835ab877c786fec32f546a74d234cf7b7c51d52720619d4905dbf6"} Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.486520 4914 scope.go:117] "RemoveContainer" containerID="1efcced53d4c59f33eecabc32c02a82a327a73342e8b369fa2aa989e0130df29" Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.486816 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kk2g" Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.519813 4914 scope.go:117] "RemoveContainer" containerID="3fc7c6999454190607a44a3782b41950bd59fa77c68315821e44554b60bc5da1" Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.548868 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7kk2g"] Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.555511 4914 scope.go:117] "RemoveContainer" containerID="1496e7b71ea268b10c71328e8c3cd10d6a54f8cedabb613af827c43f3aa97748" Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.556768 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7kk2g"] Jan 30 21:39:43 crc kubenswrapper[4914]: I0130 21:39:43.843981 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" path="/var/lib/kubelet/pods/ae495b2b-b99b-4051-bd64-c54667d4d9bc/volumes" Jan 30 21:39:45 crc 
kubenswrapper[4914]: I0130 21:39:45.506984 4914 generic.go:334] "Generic (PLEG): container finished" podID="59c442fc-77b4-430b-8522-86705c6f7d3c" containerID="6a7ae913839aa0a88b0da868b66af83325d763e0b18b9c7ff0cfdd7357c0510d" exitCode=0 Jan 30 21:39:45 crc kubenswrapper[4914]: I0130 21:39:45.507044 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" event={"ID":"59c442fc-77b4-430b-8522-86705c6f7d3c","Type":"ContainerDied","Data":"6a7ae913839aa0a88b0da868b66af83325d763e0b18b9c7ff0cfdd7357c0510d"} Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.144330 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.342657 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t44t\" (UniqueName: \"kubernetes.io/projected/59c442fc-77b4-430b-8522-86705c6f7d3c-kube-api-access-5t44t\") pod \"59c442fc-77b4-430b-8522-86705c6f7d3c\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.342861 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-inventory\") pod \"59c442fc-77b4-430b-8522-86705c6f7d3c\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.342944 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-repo-setup-combined-ca-bundle\") pod \"59c442fc-77b4-430b-8522-86705c6f7d3c\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.343050 4914 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-ssh-key-openstack-edpm-ipam\") pod \"59c442fc-77b4-430b-8522-86705c6f7d3c\" (UID: \"59c442fc-77b4-430b-8522-86705c6f7d3c\") " Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.361022 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "59c442fc-77b4-430b-8522-86705c6f7d3c" (UID: "59c442fc-77b4-430b-8522-86705c6f7d3c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.361033 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c442fc-77b4-430b-8522-86705c6f7d3c-kube-api-access-5t44t" (OuterVolumeSpecName: "kube-api-access-5t44t") pod "59c442fc-77b4-430b-8522-86705c6f7d3c" (UID: "59c442fc-77b4-430b-8522-86705c6f7d3c"). InnerVolumeSpecName "kube-api-access-5t44t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.375063 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-inventory" (OuterVolumeSpecName: "inventory") pod "59c442fc-77b4-430b-8522-86705c6f7d3c" (UID: "59c442fc-77b4-430b-8522-86705c6f7d3c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.396339 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59c442fc-77b4-430b-8522-86705c6f7d3c" (UID: "59c442fc-77b4-430b-8522-86705c6f7d3c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.447852 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t44t\" (UniqueName: \"kubernetes.io/projected/59c442fc-77b4-430b-8522-86705c6f7d3c-kube-api-access-5t44t\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.447918 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.447941 4914 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.447962 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c442fc-77b4-430b-8522-86705c6f7d3c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.538190 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" event={"ID":"59c442fc-77b4-430b-8522-86705c6f7d3c","Type":"ContainerDied","Data":"59f72ba8451fd2c8b516713768ef25eb75355b45f0662edad81adb25e8465594"} Jan 30 
21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.538227 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f72ba8451fd2c8b516713768ef25eb75355b45f0662edad81adb25e8465594" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.538245 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.626729 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"] Jan 30 21:39:47 crc kubenswrapper[4914]: E0130 21:39:47.627246 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="extract-content" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.627267 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="extract-content" Jan 30 21:39:47 crc kubenswrapper[4914]: E0130 21:39:47.627305 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c442fc-77b4-430b-8522-86705c6f7d3c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.627314 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c442fc-77b4-430b-8522-86705c6f7d3c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 21:39:47 crc kubenswrapper[4914]: E0130 21:39:47.627330 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.627338 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" Jan 30 21:39:47 crc kubenswrapper[4914]: E0130 21:39:47.627354 4914 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="extract-utilities" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.627364 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="extract-utilities" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.627577 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae495b2b-b99b-4051-bd64-c54667d4d9bc" containerName="registry-server" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.627618 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c442fc-77b4-430b-8522-86705c6f7d3c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.628577 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.631315 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.631436 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.631892 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.642985 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.645169 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"] Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.752906 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.752985 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.753044 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8dlk\" (UniqueName: \"kubernetes.io/projected/672f23d1-408d-4b3e-9068-66faf28b06bb-kube-api-access-h8dlk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.861496 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.862277 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.862428 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8dlk\" (UniqueName: \"kubernetes.io/projected/672f23d1-408d-4b3e-9068-66faf28b06bb-kube-api-access-h8dlk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.866361 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.866367 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.879329 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8dlk\" (UniqueName: \"kubernetes.io/projected/672f23d1-408d-4b3e-9068-66faf28b06bb-kube-api-access-h8dlk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jvkxv\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:47 crc kubenswrapper[4914]: I0130 21:39:47.953346 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:48 crc kubenswrapper[4914]: I0130 21:39:48.489733 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"]
Jan 30 21:39:48 crc kubenswrapper[4914]: W0130 21:39:48.490929 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod672f23d1_408d_4b3e_9068_66faf28b06bb.slice/crio-2067c5446e1bdc9562fb6f43c303582d31cff59b60f8d37b4d1872c95b75464b WatchSource:0}: Error finding container 2067c5446e1bdc9562fb6f43c303582d31cff59b60f8d37b4d1872c95b75464b: Status 404 returned error can't find the container with id 2067c5446e1bdc9562fb6f43c303582d31cff59b60f8d37b4d1872c95b75464b
Jan 30 21:39:48 crc kubenswrapper[4914]: I0130 21:39:48.551588 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv" event={"ID":"672f23d1-408d-4b3e-9068-66faf28b06bb","Type":"ContainerStarted","Data":"2067c5446e1bdc9562fb6f43c303582d31cff59b60f8d37b4d1872c95b75464b"}
Jan 30 21:39:49 crc kubenswrapper[4914]: I0130 21:39:49.561693 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv" event={"ID":"672f23d1-408d-4b3e-9068-66faf28b06bb","Type":"ContainerStarted","Data":"98a653c442de44396f79b7369d48cf5c0990bc7ec8cef5e35cc2f054ac9907e8"}
Jan 30 21:39:49 crc kubenswrapper[4914]: I0130 21:39:49.580420 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv" podStartSLOduration=2.078718404 podStartE2EDuration="2.580399619s" podCreationTimestamp="2026-01-30 21:39:47 +0000 UTC" firstStartedPulling="2026-01-30 21:39:48.493179458 +0000 UTC m=+1521.931816229" lastFinishedPulling="2026-01-30 21:39:48.994860693 +0000 UTC m=+1522.433497444" observedRunningTime="2026-01-30 21:39:49.574160016 +0000 UTC m=+1523.012796777" watchObservedRunningTime="2026-01-30 21:39:49.580399619 +0000 UTC m=+1523.019036380"
Jan 30 21:39:52 crc kubenswrapper[4914]: I0130 21:39:52.588347 4914 generic.go:334] "Generic (PLEG): container finished" podID="672f23d1-408d-4b3e-9068-66faf28b06bb" containerID="98a653c442de44396f79b7369d48cf5c0990bc7ec8cef5e35cc2f054ac9907e8" exitCode=0
Jan 30 21:39:52 crc kubenswrapper[4914]: I0130 21:39:52.588434 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv" event={"ID":"672f23d1-408d-4b3e-9068-66faf28b06bb","Type":"ContainerDied","Data":"98a653c442de44396f79b7369d48cf5c0990bc7ec8cef5e35cc2f054ac9907e8"}
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.120204 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.286069 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-ssh-key-openstack-edpm-ipam\") pod \"672f23d1-408d-4b3e-9068-66faf28b06bb\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") "
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.286173 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-inventory\") pod \"672f23d1-408d-4b3e-9068-66faf28b06bb\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") "
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.286204 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8dlk\" (UniqueName: \"kubernetes.io/projected/672f23d1-408d-4b3e-9068-66faf28b06bb-kube-api-access-h8dlk\") pod \"672f23d1-408d-4b3e-9068-66faf28b06bb\" (UID: \"672f23d1-408d-4b3e-9068-66faf28b06bb\") "
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.291326 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672f23d1-408d-4b3e-9068-66faf28b06bb-kube-api-access-h8dlk" (OuterVolumeSpecName: "kube-api-access-h8dlk") pod "672f23d1-408d-4b3e-9068-66faf28b06bb" (UID: "672f23d1-408d-4b3e-9068-66faf28b06bb"). InnerVolumeSpecName "kube-api-access-h8dlk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.318744 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "672f23d1-408d-4b3e-9068-66faf28b06bb" (UID: "672f23d1-408d-4b3e-9068-66faf28b06bb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.331591 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-inventory" (OuterVolumeSpecName: "inventory") pod "672f23d1-408d-4b3e-9068-66faf28b06bb" (UID: "672f23d1-408d-4b3e-9068-66faf28b06bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.388753 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.388786 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8dlk\" (UniqueName: \"kubernetes.io/projected/672f23d1-408d-4b3e-9068-66faf28b06bb-kube-api-access-h8dlk\") on node \"crc\" DevicePath \"\""
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.388796 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672f23d1-408d-4b3e-9068-66faf28b06bb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.607318 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv" event={"ID":"672f23d1-408d-4b3e-9068-66faf28b06bb","Type":"ContainerDied","Data":"2067c5446e1bdc9562fb6f43c303582d31cff59b60f8d37b4d1872c95b75464b"}
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.607563 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2067c5446e1bdc9562fb6f43c303582d31cff59b60f8d37b4d1872c95b75464b"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.607360 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jvkxv"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.694450 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"]
Jan 30 21:39:54 crc kubenswrapper[4914]: E0130 21:39:54.694978 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672f23d1-408d-4b3e-9068-66faf28b06bb" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.695006 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="672f23d1-408d-4b3e-9068-66faf28b06bb" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.695281 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="672f23d1-408d-4b3e-9068-66faf28b06bb" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.696191 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.698209 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.698370 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.698928 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.699970 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.738471 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"]
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.798301 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.798400 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttsxt\" (UniqueName: \"kubernetes.io/projected/eed9f005-df08-43c9-b8e8-cd334d777714-kube-api-access-ttsxt\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.798466 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.798564 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.900976 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.901074 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttsxt\" (UniqueName: \"kubernetes.io/projected/eed9f005-df08-43c9-b8e8-cd334d777714-kube-api-access-ttsxt\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.901118 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.901184 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.906425 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.906466 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.909237 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:54 crc kubenswrapper[4914]: I0130 21:39:54.920450 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttsxt\" (UniqueName: \"kubernetes.io/projected/eed9f005-df08-43c9-b8e8-cd334d777714-kube-api-access-ttsxt\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:55 crc kubenswrapper[4914]: I0130 21:39:55.052357 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:39:55 crc kubenswrapper[4914]: I0130 21:39:55.702218 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"]
Jan 30 21:39:55 crc kubenswrapper[4914]: I0130 21:39:55.707893 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 21:39:56 crc kubenswrapper[4914]: I0130 21:39:56.635399 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm" event={"ID":"eed9f005-df08-43c9-b8e8-cd334d777714","Type":"ContainerStarted","Data":"2a319d8ec27bccb7d2d37c5b69209dcfad8e59f86dd18f877ef52c837ddadba9"}
Jan 30 21:39:57 crc kubenswrapper[4914]: I0130 21:39:57.646478 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm" event={"ID":"eed9f005-df08-43c9-b8e8-cd334d777714","Type":"ContainerStarted","Data":"160b40641140c1c770c15d2911cf970fdc35d04cfa398a0150649a7bea7c662d"}
Jan 30 21:39:57 crc kubenswrapper[4914]: I0130 21:39:57.669492 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm" podStartSLOduration=2.747690924 podStartE2EDuration="3.669462937s" podCreationTimestamp="2026-01-30 21:39:54 +0000 UTC" firstStartedPulling="2026-01-30 21:39:55.707597538 +0000 UTC m=+1529.146234299" lastFinishedPulling="2026-01-30 21:39:56.629369531 +0000 UTC m=+1530.068006312" observedRunningTime="2026-01-30 21:39:57.661585694 +0000 UTC m=+1531.100222455" watchObservedRunningTime="2026-01-30 21:39:57.669462937 +0000 UTC m=+1531.108099698"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.440616 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nshqj"]
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.444071 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.451612 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nshqj"]
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.616210 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvdf\" (UniqueName: \"kubernetes.io/projected/44478904-55f8-450b-b06b-90dddd7fd16c-kube-api-access-cqvdf\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.616327 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-catalog-content\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.616377 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-utilities\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.720063 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-utilities\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.720533 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvdf\" (UniqueName: \"kubernetes.io/projected/44478904-55f8-450b-b06b-90dddd7fd16c-kube-api-access-cqvdf\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.720823 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-utilities\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.721880 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-catalog-content\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.722366 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-catalog-content\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.752796 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvdf\" (UniqueName: \"kubernetes.io/projected/44478904-55f8-450b-b06b-90dddd7fd16c-kube-api-access-cqvdf\") pod \"community-operators-nshqj\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") " pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:39:59 crc kubenswrapper[4914]: I0130 21:39:59.808761 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:00 crc kubenswrapper[4914]: W0130 21:40:00.323237 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44478904_55f8_450b_b06b_90dddd7fd16c.slice/crio-f9beb652d27164d5cc2c0260422986b9a9b5dbc64eeb99b8e73f5c0117aa7502 WatchSource:0}: Error finding container f9beb652d27164d5cc2c0260422986b9a9b5dbc64eeb99b8e73f5c0117aa7502: Status 404 returned error can't find the container with id f9beb652d27164d5cc2c0260422986b9a9b5dbc64eeb99b8e73f5c0117aa7502
Jan 30 21:40:00 crc kubenswrapper[4914]: I0130 21:40:00.325584 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nshqj"]
Jan 30 21:40:00 crc kubenswrapper[4914]: I0130 21:40:00.693741 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerDied","Data":"8c1dac39638e0a0f6c7060838587a4243e75c18b17286e4cfc3dd6ed2af5e042"}
Jan 30 21:40:00 crc kubenswrapper[4914]: I0130 21:40:00.693686 4914 generic.go:334] "Generic (PLEG): container finished" podID="44478904-55f8-450b-b06b-90dddd7fd16c" containerID="8c1dac39638e0a0f6c7060838587a4243e75c18b17286e4cfc3dd6ed2af5e042" exitCode=0
Jan 30 21:40:00 crc kubenswrapper[4914]: I0130 21:40:00.694052 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerStarted","Data":"f9beb652d27164d5cc2c0260422986b9a9b5dbc64eeb99b8e73f5c0117aa7502"}
Jan 30 21:40:01 crc kubenswrapper[4914]: I0130 21:40:01.710205 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerStarted","Data":"6ec85858870d810a44bf213b3bdf3ef15a3f05121fd14148f8399a585262143d"}
Jan 30 21:40:03 crc kubenswrapper[4914]: I0130 21:40:03.735684 4914 generic.go:334] "Generic (PLEG): container finished" podID="44478904-55f8-450b-b06b-90dddd7fd16c" containerID="6ec85858870d810a44bf213b3bdf3ef15a3f05121fd14148f8399a585262143d" exitCode=0
Jan 30 21:40:03 crc kubenswrapper[4914]: I0130 21:40:03.735743 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerDied","Data":"6ec85858870d810a44bf213b3bdf3ef15a3f05121fd14148f8399a585262143d"}
Jan 30 21:40:04 crc kubenswrapper[4914]: I0130 21:40:04.752156 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerStarted","Data":"7ad40b840709eebbac538833066c8f2b96306f23f2833fbafaf064f3908001e9"}
Jan 30 21:40:04 crc kubenswrapper[4914]: I0130 21:40:04.788980 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nshqj" podStartSLOduration=2.298172787 podStartE2EDuration="5.788953964s" podCreationTimestamp="2026-01-30 21:39:59 +0000 UTC" firstStartedPulling="2026-01-30 21:40:00.699295645 +0000 UTC m=+1534.137932406" lastFinishedPulling="2026-01-30 21:40:04.190076822 +0000 UTC m=+1537.628713583" observedRunningTime="2026-01-30 21:40:04.771358024 +0000 UTC m=+1538.209994795" watchObservedRunningTime="2026-01-30 21:40:04.788953964 +0000 UTC m=+1538.227590765"
Jan 30 21:40:09 crc kubenswrapper[4914]: I0130 21:40:09.809409 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:09 crc kubenswrapper[4914]: I0130 21:40:09.809936 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:09 crc kubenswrapper[4914]: I0130 21:40:09.858126 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:10 crc kubenswrapper[4914]: I0130 21:40:10.866188 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:10 crc kubenswrapper[4914]: I0130 21:40:10.922941 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nshqj"]
Jan 30 21:40:12 crc kubenswrapper[4914]: I0130 21:40:12.834078 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nshqj" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="registry-server" containerID="cri-o://7ad40b840709eebbac538833066c8f2b96306f23f2833fbafaf064f3908001e9" gracePeriod=2
Jan 30 21:40:13 crc kubenswrapper[4914]: I0130 21:40:13.845924 4914 generic.go:334] "Generic (PLEG): container finished" podID="44478904-55f8-450b-b06b-90dddd7fd16c" containerID="7ad40b840709eebbac538833066c8f2b96306f23f2833fbafaf064f3908001e9" exitCode=0
Jan 30 21:40:13 crc kubenswrapper[4914]: I0130 21:40:13.846148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerDied","Data":"7ad40b840709eebbac538833066c8f2b96306f23f2833fbafaf064f3908001e9"}
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.167670 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.336878 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-utilities\") pod \"44478904-55f8-450b-b06b-90dddd7fd16c\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") "
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.338238 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-utilities" (OuterVolumeSpecName: "utilities") pod "44478904-55f8-450b-b06b-90dddd7fd16c" (UID: "44478904-55f8-450b-b06b-90dddd7fd16c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.338157 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-catalog-content\") pod \"44478904-55f8-450b-b06b-90dddd7fd16c\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") "
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.338735 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqvdf\" (UniqueName: \"kubernetes.io/projected/44478904-55f8-450b-b06b-90dddd7fd16c-kube-api-access-cqvdf\") pod \"44478904-55f8-450b-b06b-90dddd7fd16c\" (UID: \"44478904-55f8-450b-b06b-90dddd7fd16c\") "
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.339586 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.343895 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44478904-55f8-450b-b06b-90dddd7fd16c-kube-api-access-cqvdf" (OuterVolumeSpecName: "kube-api-access-cqvdf") pod "44478904-55f8-450b-b06b-90dddd7fd16c" (UID: "44478904-55f8-450b-b06b-90dddd7fd16c"). InnerVolumeSpecName "kube-api-access-cqvdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.391659 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44478904-55f8-450b-b06b-90dddd7fd16c" (UID: "44478904-55f8-450b-b06b-90dddd7fd16c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.442092 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44478904-55f8-450b-b06b-90dddd7fd16c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.442134 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqvdf\" (UniqueName: \"kubernetes.io/projected/44478904-55f8-450b-b06b-90dddd7fd16c-kube-api-access-cqvdf\") on node \"crc\" DevicePath \"\""
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.858819 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nshqj" event={"ID":"44478904-55f8-450b-b06b-90dddd7fd16c","Type":"ContainerDied","Data":"f9beb652d27164d5cc2c0260422986b9a9b5dbc64eeb99b8e73f5c0117aa7502"}
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.858883 4914 scope.go:117] "RemoveContainer" containerID="7ad40b840709eebbac538833066c8f2b96306f23f2833fbafaf064f3908001e9"
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.858903 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nshqj"
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.899647 4914 scope.go:117] "RemoveContainer" containerID="6ec85858870d810a44bf213b3bdf3ef15a3f05121fd14148f8399a585262143d"
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.909591 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nshqj"]
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.924944 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nshqj"]
Jan 30 21:40:14 crc kubenswrapper[4914]: I0130 21:40:14.945793 4914 scope.go:117] "RemoveContainer" containerID="8c1dac39638e0a0f6c7060838587a4243e75c18b17286e4cfc3dd6ed2af5e042"
Jan 30 21:40:15 crc kubenswrapper[4914]: I0130 21:40:15.832215 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" path="/var/lib/kubelet/pods/44478904-55f8-450b-b06b-90dddd7fd16c/volumes"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.878204 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t77s9"]
Jan 30 21:40:38 crc kubenswrapper[4914]: E0130 21:40:38.879314 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="registry-server"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.879329 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="registry-server"
Jan 30 21:40:38 crc kubenswrapper[4914]: E0130 21:40:38.879347 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="extract-content"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.879353 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="extract-content"
Jan 30 21:40:38 crc kubenswrapper[4914]: E0130 21:40:38.879363 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="extract-utilities"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.879370 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="extract-utilities"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.879612 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="44478904-55f8-450b-b06b-90dddd7fd16c" containerName="registry-server"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.881333 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.903228 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t77s9"]
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.999150 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqz5\" (UniqueName: \"kubernetes.io/projected/95ad6b81-2ff0-4654-8dfe-f1919072635f-kube-api-access-ppqz5\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.999365 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-utilities\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:38 crc kubenswrapper[4914]: I0130 21:40:38.999445 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-catalog-content\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.101278 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-utilities\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.101348 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-catalog-content\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.101490 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqz5\" (UniqueName: \"kubernetes.io/projected/95ad6b81-2ff0-4654-8dfe-f1919072635f-kube-api-access-ppqz5\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.101818 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-utilities\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.101929 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-catalog-content\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.124468 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqz5\" (UniqueName: \"kubernetes.io/projected/95ad6b81-2ff0-4654-8dfe-f1919072635f-kube-api-access-ppqz5\") pod \"redhat-marketplace-t77s9\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.209013 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t77s9"
Jan 30 21:40:39 crc kubenswrapper[4914]: I0130 21:40:39.694740 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t77s9"]
Jan 30 21:40:40 crc kubenswrapper[4914]: I0130 21:40:40.146310 4914 generic.go:334] "Generic (PLEG): container finished" podID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerID="5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f" exitCode=0
Jan 30 21:40:40 crc kubenswrapper[4914]: I0130 21:40:40.146531 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerDied","Data":"5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f"}
Jan 30 21:40:40 crc kubenswrapper[4914]: I0130 21:40:40.147747 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerStarted","Data":"2b71bca034b47e5ca6e32d8dd8ff8299afb1f7562163eab5506de67fa9eba015"}
Jan 30 21:40:41 crc kubenswrapper[4914]: I0130 21:40:41.509120 4914 scope.go:117] "RemoveContainer"
containerID="4f6a107943ba05d7abfd1d86db3620d357bbd123074c5780835a11f8d8596eee" Jan 30 21:40:42 crc kubenswrapper[4914]: I0130 21:40:42.169619 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerStarted","Data":"4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532"} Jan 30 21:40:43 crc kubenswrapper[4914]: I0130 21:40:43.180370 4914 generic.go:334] "Generic (PLEG): container finished" podID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerID="4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532" exitCode=0 Jan 30 21:40:43 crc kubenswrapper[4914]: I0130 21:40:43.180464 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerDied","Data":"4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532"} Jan 30 21:40:45 crc kubenswrapper[4914]: I0130 21:40:45.205660 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerStarted","Data":"0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475"} Jan 30 21:40:45 crc kubenswrapper[4914]: I0130 21:40:45.230967 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t77s9" podStartSLOduration=3.227383519 podStartE2EDuration="7.230946181s" podCreationTimestamp="2026-01-30 21:40:38 +0000 UTC" firstStartedPulling="2026-01-30 21:40:40.14799308 +0000 UTC m=+1573.586629841" lastFinishedPulling="2026-01-30 21:40:44.151555742 +0000 UTC m=+1577.590192503" observedRunningTime="2026-01-30 21:40:45.228232894 +0000 UTC m=+1578.666869645" watchObservedRunningTime="2026-01-30 21:40:45.230946181 +0000 UTC m=+1578.669582942" Jan 30 21:40:49 crc kubenswrapper[4914]: I0130 21:40:49.210363 
4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t77s9" Jan 30 21:40:49 crc kubenswrapper[4914]: I0130 21:40:49.211019 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t77s9" Jan 30 21:40:49 crc kubenswrapper[4914]: I0130 21:40:49.262837 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t77s9" Jan 30 21:40:49 crc kubenswrapper[4914]: I0130 21:40:49.337368 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t77s9" Jan 30 21:40:49 crc kubenswrapper[4914]: I0130 21:40:49.521451 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t77s9"] Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.270462 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t77s9" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="registry-server" containerID="cri-o://0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475" gracePeriod=2 Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.909635 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t77s9" Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.976343 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-utilities\") pod \"95ad6b81-2ff0-4654-8dfe-f1919072635f\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.976568 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppqz5\" (UniqueName: \"kubernetes.io/projected/95ad6b81-2ff0-4654-8dfe-f1919072635f-kube-api-access-ppqz5\") pod \"95ad6b81-2ff0-4654-8dfe-f1919072635f\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.976804 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-catalog-content\") pod \"95ad6b81-2ff0-4654-8dfe-f1919072635f\" (UID: \"95ad6b81-2ff0-4654-8dfe-f1919072635f\") " Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.977358 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-utilities" (OuterVolumeSpecName: "utilities") pod "95ad6b81-2ff0-4654-8dfe-f1919072635f" (UID: "95ad6b81-2ff0-4654-8dfe-f1919072635f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:51 crc kubenswrapper[4914]: I0130 21:40:51.990105 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ad6b81-2ff0-4654-8dfe-f1919072635f-kube-api-access-ppqz5" (OuterVolumeSpecName: "kube-api-access-ppqz5") pod "95ad6b81-2ff0-4654-8dfe-f1919072635f" (UID: "95ad6b81-2ff0-4654-8dfe-f1919072635f"). InnerVolumeSpecName "kube-api-access-ppqz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.002899 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95ad6b81-2ff0-4654-8dfe-f1919072635f" (UID: "95ad6b81-2ff0-4654-8dfe-f1919072635f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.079650 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppqz5\" (UniqueName: \"kubernetes.io/projected/95ad6b81-2ff0-4654-8dfe-f1919072635f-kube-api-access-ppqz5\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.079679 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.079688 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ad6b81-2ff0-4654-8dfe-f1919072635f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.284970 4914 generic.go:334] "Generic (PLEG): container finished" podID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerID="0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475" exitCode=0 Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.285023 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerDied","Data":"0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475"} Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.285053 4914 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t77s9" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.285089 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t77s9" event={"ID":"95ad6b81-2ff0-4654-8dfe-f1919072635f","Type":"ContainerDied","Data":"2b71bca034b47e5ca6e32d8dd8ff8299afb1f7562163eab5506de67fa9eba015"} Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.285114 4914 scope.go:117] "RemoveContainer" containerID="0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.341165 4914 scope.go:117] "RemoveContainer" containerID="4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.341382 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t77s9"] Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.353269 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t77s9"] Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.366120 4914 scope.go:117] "RemoveContainer" containerID="5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.422786 4914 scope.go:117] "RemoveContainer" containerID="0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475" Jan 30 21:40:52 crc kubenswrapper[4914]: E0130 21:40:52.423612 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475\": container with ID starting with 0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475 not found: ID does not exist" containerID="0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.423649 4914 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475"} err="failed to get container status \"0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475\": rpc error: code = NotFound desc = could not find container \"0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475\": container with ID starting with 0a63a125876c7be7ca716b833bad8bf336692d32064e25f4ac49d3a7c7e41475 not found: ID does not exist" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.423678 4914 scope.go:117] "RemoveContainer" containerID="4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532" Jan 30 21:40:52 crc kubenswrapper[4914]: E0130 21:40:52.424083 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532\": container with ID starting with 4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532 not found: ID does not exist" containerID="4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.424114 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532"} err="failed to get container status \"4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532\": rpc error: code = NotFound desc = could not find container \"4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532\": container with ID starting with 4a4a84e806203880901e5c03cee34a9f79a82289f4be2f67120be6acf4e5a532 not found: ID does not exist" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.424137 4914 scope.go:117] "RemoveContainer" containerID="5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f" Jan 30 21:40:52 crc kubenswrapper[4914]: E0130 
21:40:52.424586 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f\": container with ID starting with 5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f not found: ID does not exist" containerID="5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f" Jan 30 21:40:52 crc kubenswrapper[4914]: I0130 21:40:52.424646 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f"} err="failed to get container status \"5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f\": rpc error: code = NotFound desc = could not find container \"5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f\": container with ID starting with 5811e3725f48c6cf64819bc37ccca0f66a954928c7dd11d85fa08536780cbc5f not found: ID does not exist" Jan 30 21:40:53 crc kubenswrapper[4914]: I0130 21:40:53.833829 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" path="/var/lib/kubelet/pods/95ad6b81-2ff0-4654-8dfe-f1919072635f/volumes" Jan 30 21:41:26 crc kubenswrapper[4914]: I0130 21:41:26.983310 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:41:26 crc kubenswrapper[4914]: I0130 21:41:26.983831 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 30 21:41:41 crc kubenswrapper[4914]: I0130 21:41:41.614295 4914 scope.go:117] "RemoveContainer" containerID="88f7ac1c0cdd9115908efb204c089ddf0248133f9f26ab03074090a884bf99c1" Jan 30 21:41:41 crc kubenswrapper[4914]: I0130 21:41:41.642066 4914 scope.go:117] "RemoveContainer" containerID="7226b109ee0610df021d7b028718ba70f315cd24195d3bf0b5a1738ea52be53b" Jan 30 21:41:41 crc kubenswrapper[4914]: I0130 21:41:41.696909 4914 scope.go:117] "RemoveContainer" containerID="1cb7e0c8338c4e43b8e92ecacd04c310d38d3d5f776f226e015397c968c01590" Jan 30 21:41:41 crc kubenswrapper[4914]: I0130 21:41:41.735010 4914 scope.go:117] "RemoveContainer" containerID="ec1355b9b35d302affb5d502116cf5e28958ca83204218cbc0d3271ccc0855f3" Jan 30 21:41:56 crc kubenswrapper[4914]: I0130 21:41:56.983487 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:41:56 crc kubenswrapper[4914]: I0130 21:41:56.983999 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:42:26 crc kubenswrapper[4914]: I0130 21:42:26.983090 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:42:26 crc kubenswrapper[4914]: I0130 21:42:26.983618 4914 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:42:26 crc kubenswrapper[4914]: I0130 21:42:26.983660 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:42:26 crc kubenswrapper[4914]: I0130 21:42:26.984397 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:42:26 crc kubenswrapper[4914]: I0130 21:42:26.984446 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" gracePeriod=600 Jan 30 21:42:27 crc kubenswrapper[4914]: E0130 21:42:27.155741 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:42:27 crc kubenswrapper[4914]: I0130 21:42:27.364920 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" 
containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" exitCode=0 Jan 30 21:42:27 crc kubenswrapper[4914]: I0130 21:42:27.364966 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"} Jan 30 21:42:27 crc kubenswrapper[4914]: I0130 21:42:27.365000 4914 scope.go:117] "RemoveContainer" containerID="018dff8f009112f2d13f034fc24ae6b87f418ea17a0bfaeb82d8fef0d185a5d1" Jan 30 21:42:27 crc kubenswrapper[4914]: I0130 21:42:27.365836 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:42:27 crc kubenswrapper[4914]: E0130 21:42:27.366185 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:42:37 crc kubenswrapper[4914]: I0130 21:42:37.825224 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:42:37 crc kubenswrapper[4914]: E0130 21:42:37.826195 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:42:49 crc kubenswrapper[4914]: I0130 
21:42:49.818000 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:42:49 crc kubenswrapper[4914]: E0130 21:42:49.818876 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:42:53 crc kubenswrapper[4914]: I0130 21:42:53.652143 4914 generic.go:334] "Generic (PLEG): container finished" podID="eed9f005-df08-43c9-b8e8-cd334d777714" containerID="160b40641140c1c770c15d2911cf970fdc35d04cfa398a0150649a7bea7c662d" exitCode=0 Jan 30 21:42:53 crc kubenswrapper[4914]: I0130 21:42:53.652223 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm" event={"ID":"eed9f005-df08-43c9-b8e8-cd334d777714","Type":"ContainerDied","Data":"160b40641140c1c770c15d2911cf970fdc35d04cfa398a0150649a7bea7c662d"} Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.235373 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.297085 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-inventory\") pod \"eed9f005-df08-43c9-b8e8-cd334d777714\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.297169 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttsxt\" (UniqueName: \"kubernetes.io/projected/eed9f005-df08-43c9-b8e8-cd334d777714-kube-api-access-ttsxt\") pod \"eed9f005-df08-43c9-b8e8-cd334d777714\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.297249 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-ssh-key-openstack-edpm-ipam\") pod \"eed9f005-df08-43c9-b8e8-cd334d777714\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.297318 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-bootstrap-combined-ca-bundle\") pod \"eed9f005-df08-43c9-b8e8-cd334d777714\" (UID: \"eed9f005-df08-43c9-b8e8-cd334d777714\") " Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.308594 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed9f005-df08-43c9-b8e8-cd334d777714-kube-api-access-ttsxt" (OuterVolumeSpecName: "kube-api-access-ttsxt") pod "eed9f005-df08-43c9-b8e8-cd334d777714" (UID: "eed9f005-df08-43c9-b8e8-cd334d777714"). InnerVolumeSpecName "kube-api-access-ttsxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.313799 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "eed9f005-df08-43c9-b8e8-cd334d777714" (UID: "eed9f005-df08-43c9-b8e8-cd334d777714"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.334653 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-inventory" (OuterVolumeSpecName: "inventory") pod "eed9f005-df08-43c9-b8e8-cd334d777714" (UID: "eed9f005-df08-43c9-b8e8-cd334d777714"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.341613 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eed9f005-df08-43c9-b8e8-cd334d777714" (UID: "eed9f005-df08-43c9-b8e8-cd334d777714"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.399827 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.399866 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttsxt\" (UniqueName: \"kubernetes.io/projected/eed9f005-df08-43c9-b8e8-cd334d777714-kube-api-access-ttsxt\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.399879 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.399890 4914 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9f005-df08-43c9-b8e8-cd334d777714-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.669827 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm" event={"ID":"eed9f005-df08-43c9-b8e8-cd334d777714","Type":"ContainerDied","Data":"2a319d8ec27bccb7d2d37c5b69209dcfad8e59f86dd18f877ef52c837ddadba9"} Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.669868 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a319d8ec27bccb7d2d37c5b69209dcfad8e59f86dd18f877ef52c837ddadba9" Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.669916 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.773164 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"]
Jan 30 21:42:55 crc kubenswrapper[4914]: E0130 21:42:55.773658 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="extract-content"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.773678 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="extract-content"
Jan 30 21:42:55 crc kubenswrapper[4914]: E0130 21:42:55.773726 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed9f005-df08-43c9-b8e8-cd334d777714" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.773738 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed9f005-df08-43c9-b8e8-cd334d777714" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:42:55 crc kubenswrapper[4914]: E0130 21:42:55.773754 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="registry-server"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.773761 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="registry-server"
Jan 30 21:42:55 crc kubenswrapper[4914]: E0130 21:42:55.773781 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="extract-utilities"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.773789 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="extract-utilities"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.774084 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ad6b81-2ff0-4654-8dfe-f1919072635f" containerName="registry-server"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.774102 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed9f005-df08-43c9-b8e8-cd334d777714" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.775030 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.778251 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.778371 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.778379 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.778382 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.784767 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"]
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.909853 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.910015 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwld5\" (UniqueName: \"kubernetes.io/projected/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-kube-api-access-fwld5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:55 crc kubenswrapper[4914]: I0130 21:42:55.910117 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.011868 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.012235 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwld5\" (UniqueName: \"kubernetes.io/projected/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-kube-api-access-fwld5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.012302 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.016743 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.017098 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.031258 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwld5\" (UniqueName: \"kubernetes.io/projected/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-kube-api-access-fwld5\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bczr4\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.093249 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.635604 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"]
Jan 30 21:42:56 crc kubenswrapper[4914]: I0130 21:42:56.680550 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4" event={"ID":"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7","Type":"ContainerStarted","Data":"91b0d5321b8a3768b8d78fc1d5326710c3bf2fc67066d3d0433094abaa8fd865"}
Jan 30 21:42:57 crc kubenswrapper[4914]: I0130 21:42:57.694873 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4" event={"ID":"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7","Type":"ContainerStarted","Data":"0bc5ffd465b540e92012f6c2a817c1364c07965d1478006726556c7ba9ecdbab"}
Jan 30 21:42:57 crc kubenswrapper[4914]: I0130 21:42:57.718268 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4" podStartSLOduration=2.236863639 podStartE2EDuration="2.718248089s" podCreationTimestamp="2026-01-30 21:42:55 +0000 UTC" firstStartedPulling="2026-01-30 21:42:56.647048382 +0000 UTC m=+1710.085685143" lastFinishedPulling="2026-01-30 21:42:57.128432832 +0000 UTC m=+1710.567069593" observedRunningTime="2026-01-30 21:42:57.710389247 +0000 UTC m=+1711.149026008" watchObservedRunningTime="2026-01-30 21:42:57.718248089 +0000 UTC m=+1711.156884850"
Jan 30 21:43:02 crc kubenswrapper[4914]: I0130 21:43:02.055988 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-px97v"]
Jan 30 21:43:02 crc kubenswrapper[4914]: I0130 21:43:02.067887 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-dd72-account-create-update-dltrr"]
Jan 30 21:43:02 crc kubenswrapper[4914]: I0130 21:43:02.081861 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-px97v"]
Jan 30 21:43:02 crc kubenswrapper[4914]: I0130 21:43:02.093686 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-dd72-account-create-update-dltrr"]
Jan 30 21:43:03 crc kubenswrapper[4914]: I0130 21:43:03.818625 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:43:03 crc kubenswrapper[4914]: E0130 21:43:03.819409 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:43:03 crc kubenswrapper[4914]: I0130 21:43:03.834626 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a04c4b-a588-403c-b989-d1eb41d4cd13" path="/var/lib/kubelet/pods/87a04c4b-a588-403c-b989-d1eb41d4cd13/volumes"
Jan 30 21:43:03 crc kubenswrapper[4914]: I0130 21:43:03.836154 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0e095e-5ba2-4450-9006-3d471fd30225" path="/var/lib/kubelet/pods/8f0e095e-5ba2-4450-9006-3d471fd30225/volumes"
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.034886 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1945-account-create-update-crf4j"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.049790 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tvrvx"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.061428 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7688-account-create-update-m7mwk"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.070592 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jfgqq"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.080810 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1945-account-create-update-crf4j"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.088431 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tvrvx"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.096068 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7688-account-create-update-m7mwk"]
Jan 30 21:43:04 crc kubenswrapper[4914]: I0130 21:43:04.111181 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jfgqq"]
Jan 30 21:43:05 crc kubenswrapper[4914]: I0130 21:43:05.831318 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656" path="/var/lib/kubelet/pods/0c7a0f0f-5fc9-4fb3-ab9a-013f9621b656/volumes"
Jan 30 21:43:05 crc kubenswrapper[4914]: I0130 21:43:05.834461 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed24889-cc02-4c4d-ba7f-7b397f1516ae" path="/var/lib/kubelet/pods/2ed24889-cc02-4c4d-ba7f-7b397f1516ae/volumes"
Jan 30 21:43:05 crc kubenswrapper[4914]: I0130 21:43:05.835020 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d29c2c9-d6e0-4691-b53a-ebd2205fbba3" path="/var/lib/kubelet/pods/3d29c2c9-d6e0-4691-b53a-ebd2205fbba3/volumes"
Jan 30 21:43:05 crc kubenswrapper[4914]: I0130 21:43:05.835556 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d8a7cc-b7c4-408f-a50d-9abb436695d8" path="/var/lib/kubelet/pods/d6d8a7cc-b7c4-408f-a50d-9abb436695d8/volumes"
Jan 30 21:43:14 crc kubenswrapper[4914]: I0130 21:43:14.818454 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:43:14 crc kubenswrapper[4914]: E0130 21:43:14.819453 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:43:21 crc kubenswrapper[4914]: I0130 21:43:21.049515 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dqj2s"]
Jan 30 21:43:21 crc kubenswrapper[4914]: I0130 21:43:21.064203 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dqj2s"]
Jan 30 21:43:21 crc kubenswrapper[4914]: I0130 21:43:21.833255 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df1cd5f-9cc1-4549-ba23-76c21805e479" path="/var/lib/kubelet/pods/7df1cd5f-9cc1-4549-ba23-76c21805e479/volumes"
Jan 30 21:43:27 crc kubenswrapper[4914]: I0130 21:43:27.825079 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:43:27 crc kubenswrapper[4914]: E0130 21:43:27.827518 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.059540 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-665f-account-create-update-bgn68"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.072566 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tz86b"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.094819 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3497-account-create-update-58wqv"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.109061 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tz86b"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.122639 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-4158-account-create-update-hwrhs"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.135134 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-4158-account-create-update-hwrhs"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.147875 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3497-account-create-update-58wqv"]
Jan 30 21:43:34 crc kubenswrapper[4914]: I0130 21:43:34.173081 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-665f-account-create-update-bgn68"]
Jan 30 21:43:35 crc kubenswrapper[4914]: I0130 21:43:35.835737 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001ceb92-59d0-496f-8331-51d4e131b419" path="/var/lib/kubelet/pods/001ceb92-59d0-496f-8331-51d4e131b419/volumes"
Jan 30 21:43:35 crc kubenswrapper[4914]: I0130 21:43:35.836872 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562c29cc-4430-4d60-9577-92ff7849353f" path="/var/lib/kubelet/pods/562c29cc-4430-4d60-9577-92ff7849353f/volumes"
Jan 30 21:43:35 crc kubenswrapper[4914]: I0130 21:43:35.837546 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580c0e5e-259e-4164-ac77-ca625b915ffa" path="/var/lib/kubelet/pods/580c0e5e-259e-4164-ac77-ca625b915ffa/volumes"
Jan 30 21:43:35 crc kubenswrapper[4914]: I0130 21:43:35.838893 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6451aee5-930f-41e5-8a8c-8b50c3b3a887" path="/var/lib/kubelet/pods/6451aee5-930f-41e5-8a8c-8b50c3b3a887/volumes"
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.040861 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-m7dsr"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.056844 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-g6c92"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.072542 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f2e8-account-create-update-tnvvs"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.082009 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8w2bj"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.091163 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-g6c92"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.115145 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-m7dsr"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.127988 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f2e8-account-create-update-tnvvs"]
Jan 30 21:43:38 crc kubenswrapper[4914]: I0130 21:43:38.138829 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8w2bj"]
Jan 30 21:43:39 crc kubenswrapper[4914]: I0130 21:43:39.836407 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="141de14b-d8ba-45b3-96de-efe388b9fc35" path="/var/lib/kubelet/pods/141de14b-d8ba-45b3-96de-efe388b9fc35/volumes"
Jan 30 21:43:39 crc kubenswrapper[4914]: I0130 21:43:39.837570 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c5fce8-6d2b-47ae-ba0d-fbbc545de111" path="/var/lib/kubelet/pods/83c5fce8-6d2b-47ae-ba0d-fbbc545de111/volumes"
Jan 30 21:43:39 crc kubenswrapper[4914]: I0130 21:43:39.838272 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9cc7a16-5905-4608-9212-2440d7235a11" path="/var/lib/kubelet/pods/a9cc7a16-5905-4608-9212-2440d7235a11/volumes"
Jan 30 21:43:39 crc kubenswrapper[4914]: I0130 21:43:39.838986 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb47ba3a-cc1d-48fb-9974-0d30deca719b" path="/var/lib/kubelet/pods/fb47ba3a-cc1d-48fb-9974-0d30deca719b/volumes"
Jan 30 21:43:40 crc kubenswrapper[4914]: I0130 21:43:40.046740 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lx7cf"]
Jan 30 21:43:40 crc kubenswrapper[4914]: I0130 21:43:40.060065 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lx7cf"]
Jan 30 21:43:40 crc kubenswrapper[4914]: I0130 21:43:40.819111 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:43:40 crc kubenswrapper[4914]: E0130 21:43:40.819492 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:43:41 crc kubenswrapper[4914]: I0130 21:43:41.833241 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf598ffa-7a5d-4a7b-a547-cbf01cdefc25" path="/var/lib/kubelet/pods/cf598ffa-7a5d-4a7b-a547-cbf01cdefc25/volumes"
Jan 30 21:43:41 crc kubenswrapper[4914]: I0130 21:43:41.944049 4914 scope.go:117] "RemoveContainer" containerID="c90eaa9a4b87197d0c33698c04e8ea796e9b104f4dd2da6447e0da468c1396e9"
Jan 30 21:43:41 crc kubenswrapper[4914]: I0130 21:43:41.987507 4914 scope.go:117] "RemoveContainer" containerID="ca2f074c48f1a84505896a3ddbe39d8760969f813b16824be596fb08d08b9dff"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.045822 4914 scope.go:117] "RemoveContainer" containerID="6cf5a0d9b79786c8302d1183ee5ea12812d37a8aff3ec2c26ba49deebb94df85"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.112784 4914 scope.go:117] "RemoveContainer" containerID="9c23f1cb43b6c4e8d1730f0b6363775edd8078930f6ddfa0cb5a4ccbc11d4a1e"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.181265 4914 scope.go:117] "RemoveContainer" containerID="6b3aa84496d9457083f50d01b95c90e861e78b5b44e664adc23c6ee98b777844"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.224465 4914 scope.go:117] "RemoveContainer" containerID="0d5b5ff41743d65278e9fe8ff021a803ff8b8bbd1562fdbff225c5269d778b54"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.282879 4914 scope.go:117] "RemoveContainer" containerID="2ee04ef2d6c1149023df191f681a0176b24938c76eed9fe58e630af38f3ee8da"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.305863 4914 scope.go:117] "RemoveContainer" containerID="ba9ef4df2ae722ddcbcf9ebb778a911849eb3f6cda19b59d3fed87a61e483993"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.326262 4914 scope.go:117] "RemoveContainer" containerID="09f1e288be3ca6b19edd36434bc8983709206768ab1e07db2b5fd23334fb3cde"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.357402 4914 scope.go:117] "RemoveContainer" containerID="d8f73b59ed4e69f6877673635767b5659f77808db83cc3e43a655b8df8e3956c"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.381417 4914 scope.go:117] "RemoveContainer" containerID="ef8f74605cd99918fd80a790fad82d36171ec6bd3b5e9f0e93320ff1147f1b06"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.406999 4914 scope.go:117] "RemoveContainer" containerID="dca335e33d26033d2a549eec28d093f1e5329310df9ddac334aa3550e922e28d"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.439692 4914 scope.go:117] "RemoveContainer" containerID="651d13073f2a0535d794b564d524f0be0667c6cb530bc8564966ffd0f705236b"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.464979 4914 scope.go:117] "RemoveContainer" containerID="defaddb59a3666031661e46a1e73c0d1ec9c8c43cb0c6ad9646fc9721b995ba3"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.496774 4914 scope.go:117] "RemoveContainer" containerID="f2a09f8166e99d336494e7f0904a70e077388b90920bb1b1fe93e52739dd082c"
Jan 30 21:43:42 crc kubenswrapper[4914]: I0130 21:43:42.521613 4914 scope.go:117] "RemoveContainer" containerID="11c7a966dafdee6a0789c632fb8ad3b29eeadbcb5982dcdbe340ca69caf078a4"
Jan 30 21:43:45 crc kubenswrapper[4914]: I0130 21:43:45.032688 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-k9mzs"]
Jan 30 21:43:45 crc kubenswrapper[4914]: I0130 21:43:45.041926 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-k9mzs"]
Jan 30 21:43:45 crc kubenswrapper[4914]: I0130 21:43:45.830878 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ea5989-2da6-4da0-adb5-91da8e9e2779" path="/var/lib/kubelet/pods/66ea5989-2da6-4da0-adb5-91da8e9e2779/volumes"
Jan 30 21:43:55 crc kubenswrapper[4914]: I0130 21:43:55.818169 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:43:55 crc kubenswrapper[4914]: E0130 21:43:55.818820 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:44:08 crc kubenswrapper[4914]: I0130 21:44:08.817914 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:44:08 crc kubenswrapper[4914]: E0130 21:44:08.818862 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:44:21 crc kubenswrapper[4914]: I0130 21:44:21.819205 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:44:21 crc kubenswrapper[4914]: E0130 21:44:21.819833 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:44:30 crc kubenswrapper[4914]: I0130 21:44:30.423684 4914 patch_prober.go:28] interesting pod/controller-manager-c5ffdcbcc-r5wp7 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 30 21:44:30 crc kubenswrapper[4914]: I0130 21:44:30.437475 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" podUID="2240b84d-6f27-4342-b042-0977e70765d8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:44:30 crc kubenswrapper[4914]: I0130 21:44:30.423927 4914 patch_prober.go:28] interesting pod/controller-manager-c5ffdcbcc-r5wp7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 30 21:44:30 crc kubenswrapper[4914]: I0130 21:44:30.439780 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-c5ffdcbcc-r5wp7" podUID="2240b84d-6f27-4342-b042-0977e70765d8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 30 21:44:33 crc kubenswrapper[4914]: I0130 21:44:33.506838 4914 generic.go:334] "Generic (PLEG): container finished" podID="a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" containerID="0bc5ffd465b540e92012f6c2a817c1364c07965d1478006726556c7ba9ecdbab" exitCode=0
Jan 30 21:44:33 crc kubenswrapper[4914]: I0130 21:44:33.506934 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4" event={"ID":"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7","Type":"ContainerDied","Data":"0bc5ffd465b540e92012f6c2a817c1364c07965d1478006726556c7ba9ecdbab"}
Jan 30 21:44:34 crc kubenswrapper[4914]: I0130 21:44:34.822179 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56"
Jan 30 21:44:34 crc kubenswrapper[4914]: E0130 21:44:34.822906 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.050390 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.157080 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwld5\" (UniqueName: \"kubernetes.io/projected/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-kube-api-access-fwld5\") pod \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") "
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.157128 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-ssh-key-openstack-edpm-ipam\") pod \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") "
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.157166 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-inventory\") pod \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\" (UID: \"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7\") "
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.166962 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-kube-api-access-fwld5" (OuterVolumeSpecName: "kube-api-access-fwld5") pod "a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" (UID: "a8bcc4f1-23fa-40da-8a45-7c89b377e6d7"). InnerVolumeSpecName "kube-api-access-fwld5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.192392 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" (UID: "a8bcc4f1-23fa-40da-8a45-7c89b377e6d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.195229 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-inventory" (OuterVolumeSpecName: "inventory") pod "a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" (UID: "a8bcc4f1-23fa-40da-8a45-7c89b377e6d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.260262 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwld5\" (UniqueName: \"kubernetes.io/projected/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-kube-api-access-fwld5\") on node \"crc\" DevicePath \"\""
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.260293 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.260303 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8bcc4f1-23fa-40da-8a45-7c89b377e6d7-inventory\") on node \"crc\" DevicePath \"\""
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.533449 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4" event={"ID":"a8bcc4f1-23fa-40da-8a45-7c89b377e6d7","Type":"ContainerDied","Data":"91b0d5321b8a3768b8d78fc1d5326710c3bf2fc67066d3d0433094abaa8fd865"}
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.533497 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b0d5321b8a3768b8d78fc1d5326710c3bf2fc67066d3d0433094abaa8fd865"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.533524 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bczr4"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.650899 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"]
Jan 30 21:44:35 crc kubenswrapper[4914]: E0130 21:44:35.651578 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.651609 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.652072 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bcc4f1-23fa-40da-8a45-7c89b377e6d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.653499 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.656686 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.657110 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.657291 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.657480 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.687655 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"]
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.772321 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.773007 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6k2\" (UniqueName: \"kubernetes.io/projected/e9790abb-7691-489b-a30b-84738f413edc-kube-api-access-xr6k2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.773079 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.875577 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.875642 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6k2\" (UniqueName: \"kubernetes.io/projected/e9790abb-7691-489b-a30b-84738f413edc-kube-api-access-xr6k2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.875677 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.882385 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.892233 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.900052 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6k2\" (UniqueName: \"kubernetes.io/projected/e9790abb-7691-489b-a30b-84738f413edc-kube-api-access-xr6k2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-58t6k\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"
Jan 30 21:44:35 crc kubenswrapper[4914]: I0130 21:44:35.975485 4914 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" Jan 30 21:44:36 crc kubenswrapper[4914]: I0130 21:44:36.518929 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k"] Jan 30 21:44:36 crc kubenswrapper[4914]: I0130 21:44:36.545041 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" event={"ID":"e9790abb-7691-489b-a30b-84738f413edc","Type":"ContainerStarted","Data":"375905a32099d0b9e23cb4bad00ebd901eea6c13392264d6a3060adfd67302f4"} Jan 30 21:44:37 crc kubenswrapper[4914]: I0130 21:44:37.556405 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" event={"ID":"e9790abb-7691-489b-a30b-84738f413edc","Type":"ContainerStarted","Data":"a11d2e8f353da5f22f0e551f7bcc8e8a84e5ec38fac8e239e52cddb92763a689"} Jan 30 21:44:37 crc kubenswrapper[4914]: I0130 21:44:37.588307 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" podStartSLOduration=2.073087603 podStartE2EDuration="2.588283883s" podCreationTimestamp="2026-01-30 21:44:35 +0000 UTC" firstStartedPulling="2026-01-30 21:44:36.521182237 +0000 UTC m=+1809.959818998" lastFinishedPulling="2026-01-30 21:44:37.036378517 +0000 UTC m=+1810.475015278" observedRunningTime="2026-01-30 21:44:37.575313063 +0000 UTC m=+1811.013949834" watchObservedRunningTime="2026-01-30 21:44:37.588283883 +0000 UTC m=+1811.026920644" Jan 30 21:44:41 crc kubenswrapper[4914]: I0130 21:44:41.065677 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5jxqn"] Jan 30 21:44:41 crc kubenswrapper[4914]: I0130 21:44:41.075775 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9zkpj"] Jan 30 21:44:41 crc 
kubenswrapper[4914]: I0130 21:44:41.087203 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5jxqn"] Jan 30 21:44:41 crc kubenswrapper[4914]: I0130 21:44:41.096494 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9zkpj"] Jan 30 21:44:41 crc kubenswrapper[4914]: I0130 21:44:41.829914 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c531db-a02c-477b-b968-2f086a8443e8" path="/var/lib/kubelet/pods/25c531db-a02c-477b-b968-2f086a8443e8/volumes" Jan 30 21:44:41 crc kubenswrapper[4914]: I0130 21:44:41.830594 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4" path="/var/lib/kubelet/pods/5bb843c7-b3dd-494f-9eb2-ccfbf2c108c4/volumes" Jan 30 21:44:42 crc kubenswrapper[4914]: I0130 21:44:42.839819 4914 scope.go:117] "RemoveContainer" containerID="3b037bd88c1966f96490390d255c0435e12d2127123da6b735bc98001463fc16" Jan 30 21:44:42 crc kubenswrapper[4914]: I0130 21:44:42.875379 4914 scope.go:117] "RemoveContainer" containerID="b554b1f4e32799e7f2ed76f98ec781a2f9ac8d9c8e7cf091df07db8332c1a303" Jan 30 21:44:42 crc kubenswrapper[4914]: I0130 21:44:42.924083 4914 scope.go:117] "RemoveContainer" containerID="9e75279ece50bf2e3425ec5f9fd516a2954e93c69c273e1bc87e22ed259d29ad" Jan 30 21:44:48 crc kubenswrapper[4914]: I0130 21:44:48.817961 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:44:48 crc kubenswrapper[4914]: E0130 21:44:48.819988 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" 
podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:44:53 crc kubenswrapper[4914]: I0130 21:44:53.030756 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6c9nl"] Jan 30 21:44:53 crc kubenswrapper[4914]: I0130 21:44:53.039675 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6c9nl"] Jan 30 21:44:53 crc kubenswrapper[4914]: I0130 21:44:53.832282 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4afaeee-72ae-4c47-b842-d201151915c4" path="/var/lib/kubelet/pods/f4afaeee-72ae-4c47-b842-d201151915c4/volumes" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.154602 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk"] Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.156544 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.159951 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.159949 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.211534 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk"] Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.335861 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f13fe56d-0def-464d-94c4-1c0f51be1bf7-secret-volume\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.335954 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvz7\" (UniqueName: \"kubernetes.io/projected/f13fe56d-0def-464d-94c4-1c0f51be1bf7-kube-api-access-blvz7\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.336110 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f13fe56d-0def-464d-94c4-1c0f51be1bf7-config-volume\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.437404 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f13fe56d-0def-464d-94c4-1c0f51be1bf7-config-volume\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.438583 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f13fe56d-0def-464d-94c4-1c0f51be1bf7-secret-volume\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.439521 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blvz7\" (UniqueName: 
\"kubernetes.io/projected/f13fe56d-0def-464d-94c4-1c0f51be1bf7-kube-api-access-blvz7\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.438454 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f13fe56d-0def-464d-94c4-1c0f51be1bf7-config-volume\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.451404 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f13fe56d-0def-464d-94c4-1c0f51be1bf7-secret-volume\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.460509 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvz7\" (UniqueName: \"kubernetes.io/projected/f13fe56d-0def-464d-94c4-1c0f51be1bf7-kube-api-access-blvz7\") pod \"collect-profiles-29496825-2njbk\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.517948 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:00 crc kubenswrapper[4914]: I0130 21:45:00.999062 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk"] Jan 30 21:45:01 crc kubenswrapper[4914]: I0130 21:45:01.835214 4914 generic.go:334] "Generic (PLEG): container finished" podID="f13fe56d-0def-464d-94c4-1c0f51be1bf7" containerID="67f5fc382f8fbe389848ffeefedae4b9a1297904db3efab3b256b9cc1c07a8d4" exitCode=0 Jan 30 21:45:01 crc kubenswrapper[4914]: I0130 21:45:01.835294 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" event={"ID":"f13fe56d-0def-464d-94c4-1c0f51be1bf7","Type":"ContainerDied","Data":"67f5fc382f8fbe389848ffeefedae4b9a1297904db3efab3b256b9cc1c07a8d4"} Jan 30 21:45:01 crc kubenswrapper[4914]: I0130 21:45:01.835529 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" event={"ID":"f13fe56d-0def-464d-94c4-1c0f51be1bf7","Type":"ContainerStarted","Data":"a5854b8c0e66ed4001b3b0070be08fc61e38d9774417b2c4de8b337696f1382f"} Jan 30 21:45:02 crc kubenswrapper[4914]: I0130 21:45:02.818793 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:45:02 crc kubenswrapper[4914]: E0130 21:45:02.819397 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.060267 4914 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-46tqv"] Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.069869 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6kskl"] Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.092672 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6kskl"] Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.100310 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-46tqv"] Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.300154 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.406567 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blvz7\" (UniqueName: \"kubernetes.io/projected/f13fe56d-0def-464d-94c4-1c0f51be1bf7-kube-api-access-blvz7\") pod \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.406870 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f13fe56d-0def-464d-94c4-1c0f51be1bf7-secret-volume\") pod \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.406974 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f13fe56d-0def-464d-94c4-1c0f51be1bf7-config-volume\") pod \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\" (UID: \"f13fe56d-0def-464d-94c4-1c0f51be1bf7\") " Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.407556 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f13fe56d-0def-464d-94c4-1c0f51be1bf7-config-volume" (OuterVolumeSpecName: "config-volume") pod "f13fe56d-0def-464d-94c4-1c0f51be1bf7" (UID: "f13fe56d-0def-464d-94c4-1c0f51be1bf7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.418059 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13fe56d-0def-464d-94c4-1c0f51be1bf7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f13fe56d-0def-464d-94c4-1c0f51be1bf7" (UID: "f13fe56d-0def-464d-94c4-1c0f51be1bf7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.418147 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13fe56d-0def-464d-94c4-1c0f51be1bf7-kube-api-access-blvz7" (OuterVolumeSpecName: "kube-api-access-blvz7") pod "f13fe56d-0def-464d-94c4-1c0f51be1bf7" (UID: "f13fe56d-0def-464d-94c4-1c0f51be1bf7"). InnerVolumeSpecName "kube-api-access-blvz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.510114 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blvz7\" (UniqueName: \"kubernetes.io/projected/f13fe56d-0def-464d-94c4-1c0f51be1bf7-kube-api-access-blvz7\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.510445 4914 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f13fe56d-0def-464d-94c4-1c0f51be1bf7-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.510459 4914 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f13fe56d-0def-464d-94c4-1c0f51be1bf7-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.835105 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6748bae8-dcab-4fdb-ab49-b60893908a7f" path="/var/lib/kubelet/pods/6748bae8-dcab-4fdb-ab49-b60893908a7f/volumes" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.837479 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7fe1c6e-0858-479f-b365-081a1b8fcf2d" path="/var/lib/kubelet/pods/b7fe1c6e-0858-479f-b365-081a1b8fcf2d/volumes" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.863972 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" event={"ID":"f13fe56d-0def-464d-94c4-1c0f51be1bf7","Type":"ContainerDied","Data":"a5854b8c0e66ed4001b3b0070be08fc61e38d9774417b2c4de8b337696f1382f"} Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.864093 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5854b8c0e66ed4001b3b0070be08fc61e38d9774417b2c4de8b337696f1382f" Jan 30 21:45:03 crc kubenswrapper[4914]: I0130 21:45:03.864046 4914 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-2njbk" Jan 30 21:45:15 crc kubenswrapper[4914]: I0130 21:45:15.818983 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:45:15 crc kubenswrapper[4914]: E0130 21:45:15.829001 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:45:30 crc kubenswrapper[4914]: I0130 21:45:30.819029 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:45:30 crc kubenswrapper[4914]: E0130 21:45:30.820214 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:45:42 crc kubenswrapper[4914]: I0130 21:45:42.332881 4914 generic.go:334] "Generic (PLEG): container finished" podID="e9790abb-7691-489b-a30b-84738f413edc" containerID="a11d2e8f353da5f22f0e551f7bcc8e8a84e5ec38fac8e239e52cddb92763a689" exitCode=0 Jan 30 21:45:42 crc kubenswrapper[4914]: I0130 21:45:42.332968 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" 
event={"ID":"e9790abb-7691-489b-a30b-84738f413edc","Type":"ContainerDied","Data":"a11d2e8f353da5f22f0e551f7bcc8e8a84e5ec38fac8e239e52cddb92763a689"} Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.036095 4914 scope.go:117] "RemoveContainer" containerID="0f94a44c94e29ccb926b0c7329b8a2486a23c1a7486e21a995e3a0bc8eb6d347" Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.065020 4914 scope.go:117] "RemoveContainer" containerID="7aa1e887ff1d5b5fa539da14db28a32b20602da026fc610316282f7c7f00dfdf" Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.137410 4914 scope.go:117] "RemoveContainer" containerID="d8e8fbc6307d942d2de54b5e103cd757d9ee4b92d2a881c8a6c2adf44ff1d013" Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.857612 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.960448 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr6k2\" (UniqueName: \"kubernetes.io/projected/e9790abb-7691-489b-a30b-84738f413edc-kube-api-access-xr6k2\") pod \"e9790abb-7691-489b-a30b-84738f413edc\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.960587 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-ssh-key-openstack-edpm-ipam\") pod \"e9790abb-7691-489b-a30b-84738f413edc\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.960905 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-inventory\") pod \"e9790abb-7691-489b-a30b-84738f413edc\" (UID: \"e9790abb-7691-489b-a30b-84738f413edc\") " Jan 30 
21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.967287 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9790abb-7691-489b-a30b-84738f413edc-kube-api-access-xr6k2" (OuterVolumeSpecName: "kube-api-access-xr6k2") pod "e9790abb-7691-489b-a30b-84738f413edc" (UID: "e9790abb-7691-489b-a30b-84738f413edc"). InnerVolumeSpecName "kube-api-access-xr6k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:43 crc kubenswrapper[4914]: I0130 21:45:43.993550 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-inventory" (OuterVolumeSpecName: "inventory") pod "e9790abb-7691-489b-a30b-84738f413edc" (UID: "e9790abb-7691-489b-a30b-84738f413edc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.008457 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9790abb-7691-489b-a30b-84738f413edc" (UID: "e9790abb-7691-489b-a30b-84738f413edc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.065211 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.065248 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9790abb-7691-489b-a30b-84738f413edc-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.065259 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr6k2\" (UniqueName: \"kubernetes.io/projected/e9790abb-7691-489b-a30b-84738f413edc-kube-api-access-xr6k2\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.357404 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" event={"ID":"e9790abb-7691-489b-a30b-84738f413edc","Type":"ContainerDied","Data":"375905a32099d0b9e23cb4bad00ebd901eea6c13392264d6a3060adfd67302f4"} Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.357448 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="375905a32099d0b9e23cb4bad00ebd901eea6c13392264d6a3060adfd67302f4" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.357587 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-58t6k" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.460443 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8"] Jan 30 21:45:44 crc kubenswrapper[4914]: E0130 21:45:44.461163 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13fe56d-0def-464d-94c4-1c0f51be1bf7" containerName="collect-profiles" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.461182 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13fe56d-0def-464d-94c4-1c0f51be1bf7" containerName="collect-profiles" Jan 30 21:45:44 crc kubenswrapper[4914]: E0130 21:45:44.461197 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9790abb-7691-489b-a30b-84738f413edc" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.461204 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9790abb-7691-489b-a30b-84738f413edc" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.461384 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9790abb-7691-489b-a30b-84738f413edc" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.461411 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13fe56d-0def-464d-94c4-1c0f51be1bf7" containerName="collect-profiles" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.462319 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.466039 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.466095 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.466451 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.466805 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.483214 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8"] Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.575149 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.575207 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8h9\" (UniqueName: \"kubernetes.io/projected/7d4c7903-33ee-499b-abb7-4e029ec9f925-kube-api-access-7n8h9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 
21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.575270 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.678100 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.678240 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8h9\" (UniqueName: \"kubernetes.io/projected/7d4c7903-33ee-499b-abb7-4e029ec9f925-kube-api-access-7n8h9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.678339 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.685572 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.686754 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.694916 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8h9\" (UniqueName: \"kubernetes.io/projected/7d4c7903-33ee-499b-abb7-4e029ec9f925-kube-api-access-7n8h9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:44 crc kubenswrapper[4914]: I0130 21:45:44.792571 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:45 crc kubenswrapper[4914]: I0130 21:45:45.353355 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8"] Jan 30 21:45:45 crc kubenswrapper[4914]: W0130 21:45:45.364052 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d4c7903_33ee_499b_abb7_4e029ec9f925.slice/crio-28d76cdb4f4be8c323710b999c4bb536bf68b6e20f721c0fdee11a481523ec8b WatchSource:0}: Error finding container 28d76cdb4f4be8c323710b999c4bb536bf68b6e20f721c0fdee11a481523ec8b: Status 404 returned error can't find the container with id 28d76cdb4f4be8c323710b999c4bb536bf68b6e20f721c0fdee11a481523ec8b Jan 30 21:45:45 crc kubenswrapper[4914]: I0130 21:45:45.366696 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:45:45 crc kubenswrapper[4914]: I0130 21:45:45.818813 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:45:45 crc kubenswrapper[4914]: E0130 21:45:45.819431 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.073104 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ch6mc"] Jan 30 21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.081572 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ch6mc"] Jan 30 
21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.092555 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f69w2"] Jan 30 21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.103085 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f69w2"] Jan 30 21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.385733 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" event={"ID":"7d4c7903-33ee-499b-abb7-4e029ec9f925","Type":"ContainerStarted","Data":"14c1d77e3a2d0c0a0f2553ea4002a2e5e7d1157346fc67f06a4a325d969bddf1"} Jan 30 21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.385775 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" event={"ID":"7d4c7903-33ee-499b-abb7-4e029ec9f925","Type":"ContainerStarted","Data":"28d76cdb4f4be8c323710b999c4bb536bf68b6e20f721c0fdee11a481523ec8b"} Jan 30 21:45:46 crc kubenswrapper[4914]: I0130 21:45:46.412618 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" podStartSLOduration=1.972095968 podStartE2EDuration="2.412585503s" podCreationTimestamp="2026-01-30 21:45:44 +0000 UTC" firstStartedPulling="2026-01-30 21:45:45.366403814 +0000 UTC m=+1878.805040585" lastFinishedPulling="2026-01-30 21:45:45.806893359 +0000 UTC m=+1879.245530120" observedRunningTime="2026-01-30 21:45:46.40598582 +0000 UTC m=+1879.844622611" watchObservedRunningTime="2026-01-30 21:45:46.412585503 +0000 UTC m=+1879.851222314" Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.034723 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9ea2-account-create-update-7nt6g"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.047337 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-a41a-account-create-update-55jbl"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.061351 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2c59-account-create-update-ndrfd"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.073136 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dpz67"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.093609 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9ea2-account-create-update-7nt6g"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.102643 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dpz67"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.111378 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2c59-account-create-update-ndrfd"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.118716 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a41a-account-create-update-55jbl"] Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.831552 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7e1e76-75e3-4193-b755-1b044debf71f" path="/var/lib/kubelet/pods/3a7e1e76-75e3-4193-b755-1b044debf71f/volumes" Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.833430 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e7c831-3c33-429a-ac8a-e7768226c344" path="/var/lib/kubelet/pods/40e7c831-3c33-429a-ac8a-e7768226c344/volumes" Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.834755 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bed7ff8-1ca4-47d8-bb12-a13840612182" path="/var/lib/kubelet/pods/5bed7ff8-1ca4-47d8-bb12-a13840612182/volumes" Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.835937 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d2dbf656-0687-4199-8101-c25fb82801e8" path="/var/lib/kubelet/pods/d2dbf656-0687-4199-8101-c25fb82801e8/volumes" Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.838168 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5c9d5a-21ac-4c8c-a108-c6752014ec58" path="/var/lib/kubelet/pods/db5c9d5a-21ac-4c8c-a108-c6752014ec58/volumes" Jan 30 21:45:47 crc kubenswrapper[4914]: I0130 21:45:47.839253 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5edf659-0359-4a9c-aca0-d95f3ac8c57e" path="/var/lib/kubelet/pods/e5edf659-0359-4a9c-aca0-d95f3ac8c57e/volumes" Jan 30 21:45:51 crc kubenswrapper[4914]: I0130 21:45:51.442768 4914 generic.go:334] "Generic (PLEG): container finished" podID="7d4c7903-33ee-499b-abb7-4e029ec9f925" containerID="14c1d77e3a2d0c0a0f2553ea4002a2e5e7d1157346fc67f06a4a325d969bddf1" exitCode=0 Jan 30 21:45:51 crc kubenswrapper[4914]: I0130 21:45:51.442877 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" event={"ID":"7d4c7903-33ee-499b-abb7-4e029ec9f925","Type":"ContainerDied","Data":"14c1d77e3a2d0c0a0f2553ea4002a2e5e7d1157346fc67f06a4a325d969bddf1"} Jan 30 21:45:52 crc kubenswrapper[4914]: I0130 21:45:52.946240 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.065553 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-ssh-key-openstack-edpm-ipam\") pod \"7d4c7903-33ee-499b-abb7-4e029ec9f925\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.065645 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8h9\" (UniqueName: \"kubernetes.io/projected/7d4c7903-33ee-499b-abb7-4e029ec9f925-kube-api-access-7n8h9\") pod \"7d4c7903-33ee-499b-abb7-4e029ec9f925\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.065951 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-inventory\") pod \"7d4c7903-33ee-499b-abb7-4e029ec9f925\" (UID: \"7d4c7903-33ee-499b-abb7-4e029ec9f925\") " Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.072231 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4c7903-33ee-499b-abb7-4e029ec9f925-kube-api-access-7n8h9" (OuterVolumeSpecName: "kube-api-access-7n8h9") pod "7d4c7903-33ee-499b-abb7-4e029ec9f925" (UID: "7d4c7903-33ee-499b-abb7-4e029ec9f925"). InnerVolumeSpecName "kube-api-access-7n8h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.100095 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d4c7903-33ee-499b-abb7-4e029ec9f925" (UID: "7d4c7903-33ee-499b-abb7-4e029ec9f925"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.100556 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-inventory" (OuterVolumeSpecName: "inventory") pod "7d4c7903-33ee-499b-abb7-4e029ec9f925" (UID: "7d4c7903-33ee-499b-abb7-4e029ec9f925"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.168464 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.168506 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8h9\" (UniqueName: \"kubernetes.io/projected/7d4c7903-33ee-499b-abb7-4e029ec9f925-kube-api-access-7n8h9\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.168516 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d4c7903-33ee-499b-abb7-4e029ec9f925-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.469517 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" 
event={"ID":"7d4c7903-33ee-499b-abb7-4e029ec9f925","Type":"ContainerDied","Data":"28d76cdb4f4be8c323710b999c4bb536bf68b6e20f721c0fdee11a481523ec8b"} Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.469563 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28d76cdb4f4be8c323710b999c4bb536bf68b6e20f721c0fdee11a481523ec8b" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.469611 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.656062 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4"] Jan 30 21:45:53 crc kubenswrapper[4914]: E0130 21:45:53.658859 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4c7903-33ee-499b-abb7-4e029ec9f925" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.658884 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4c7903-33ee-499b-abb7-4e029ec9f925" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.659461 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4c7903-33ee-499b-abb7-4e029ec9f925" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.677533 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4"] Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.678543 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.683673 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.684314 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.684407 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.684643 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.788125 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.788200 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdql\" (UniqueName: \"kubernetes.io/projected/51e0c921-ba72-45cf-b9b3-1b28148761d4-kube-api-access-mpdql\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.788305 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.890293 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.890664 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.890797 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdql\" (UniqueName: \"kubernetes.io/projected/51e0c921-ba72-45cf-b9b3-1b28148761d4-kube-api-access-mpdql\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.896412 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.899453 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:53 crc kubenswrapper[4914]: I0130 21:45:53.920904 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdql\" (UniqueName: \"kubernetes.io/projected/51e0c921-ba72-45cf-b9b3-1b28148761d4-kube-api-access-mpdql\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ks7d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:54 crc kubenswrapper[4914]: I0130 21:45:54.016228 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:45:54 crc kubenswrapper[4914]: I0130 21:45:54.549004 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4"] Jan 30 21:45:55 crc kubenswrapper[4914]: I0130 21:45:55.487519 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" event={"ID":"51e0c921-ba72-45cf-b9b3-1b28148761d4","Type":"ContainerStarted","Data":"bfb17d0099d45703da2e093d9bf9c4c2cf94d157e1ffa75c1229abeaa37e6e85"} Jan 30 21:45:55 crc kubenswrapper[4914]: I0130 21:45:55.487985 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" event={"ID":"51e0c921-ba72-45cf-b9b3-1b28148761d4","Type":"ContainerStarted","Data":"7ba7e4829279ccd88cf90874bd304fff04a205323971a0ed2b543682178e7521"} Jan 30 21:45:55 crc kubenswrapper[4914]: I0130 21:45:55.528516 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" podStartSLOduration=2.1553068619999998 podStartE2EDuration="2.528489794s" podCreationTimestamp="2026-01-30 21:45:53 +0000 UTC" firstStartedPulling="2026-01-30 21:45:54.555150554 +0000 UTC m=+1887.993787315" lastFinishedPulling="2026-01-30 21:45:54.928333456 +0000 UTC m=+1888.366970247" observedRunningTime="2026-01-30 21:45:55.514833626 +0000 UTC m=+1888.953470397" watchObservedRunningTime="2026-01-30 21:45:55.528489794 +0000 UTC m=+1888.967126585" Jan 30 21:45:59 crc kubenswrapper[4914]: I0130 21:45:59.819932 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:45:59 crc kubenswrapper[4914]: E0130 21:45:59.820997 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:46:11 crc kubenswrapper[4914]: I0130 21:46:11.818600 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:46:11 crc kubenswrapper[4914]: E0130 21:46:11.819390 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:46:22 crc kubenswrapper[4914]: I0130 21:46:22.818307 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:46:22 crc kubenswrapper[4914]: E0130 21:46:22.819059 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:46:23 crc kubenswrapper[4914]: I0130 21:46:23.038514 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8vzt"] Jan 30 21:46:23 crc kubenswrapper[4914]: I0130 21:46:23.047211 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8vzt"] Jan 30 
21:46:23 crc kubenswrapper[4914]: I0130 21:46:23.847857 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e5479a-5cbf-479b-b3b9-3af2a3424492" path="/var/lib/kubelet/pods/39e5479a-5cbf-479b-b3b9-3af2a3424492/volumes" Jan 30 21:46:30 crc kubenswrapper[4914]: I0130 21:46:30.844294 4914 generic.go:334] "Generic (PLEG): container finished" podID="51e0c921-ba72-45cf-b9b3-1b28148761d4" containerID="bfb17d0099d45703da2e093d9bf9c4c2cf94d157e1ffa75c1229abeaa37e6e85" exitCode=0 Jan 30 21:46:30 crc kubenswrapper[4914]: I0130 21:46:30.844405 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" event={"ID":"51e0c921-ba72-45cf-b9b3-1b28148761d4","Type":"ContainerDied","Data":"bfb17d0099d45703da2e093d9bf9c4c2cf94d157e1ffa75c1229abeaa37e6e85"} Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.355839 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.469981 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpdql\" (UniqueName: \"kubernetes.io/projected/51e0c921-ba72-45cf-b9b3-1b28148761d4-kube-api-access-mpdql\") pod \"51e0c921-ba72-45cf-b9b3-1b28148761d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.470173 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-inventory\") pod \"51e0c921-ba72-45cf-b9b3-1b28148761d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.470286 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-ssh-key-openstack-edpm-ipam\") pod \"51e0c921-ba72-45cf-b9b3-1b28148761d4\" (UID: \"51e0c921-ba72-45cf-b9b3-1b28148761d4\") " Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.475679 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e0c921-ba72-45cf-b9b3-1b28148761d4-kube-api-access-mpdql" (OuterVolumeSpecName: "kube-api-access-mpdql") pod "51e0c921-ba72-45cf-b9b3-1b28148761d4" (UID: "51e0c921-ba72-45cf-b9b3-1b28148761d4"). InnerVolumeSpecName "kube-api-access-mpdql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.499174 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "51e0c921-ba72-45cf-b9b3-1b28148761d4" (UID: "51e0c921-ba72-45cf-b9b3-1b28148761d4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.499309 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-inventory" (OuterVolumeSpecName: "inventory") pod "51e0c921-ba72-45cf-b9b3-1b28148761d4" (UID: "51e0c921-ba72-45cf-b9b3-1b28148761d4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.572634 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.572918 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e0c921-ba72-45cf-b9b3-1b28148761d4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.573076 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpdql\" (UniqueName: \"kubernetes.io/projected/51e0c921-ba72-45cf-b9b3-1b28148761d4-kube-api-access-mpdql\") on node \"crc\" DevicePath \"\"" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.863483 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" event={"ID":"51e0c921-ba72-45cf-b9b3-1b28148761d4","Type":"ContainerDied","Data":"7ba7e4829279ccd88cf90874bd304fff04a205323971a0ed2b543682178e7521"} Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.863519 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ba7e4829279ccd88cf90874bd304fff04a205323971a0ed2b543682178e7521" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.863549 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ks7d4" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.952569 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4"] Jan 30 21:46:32 crc kubenswrapper[4914]: E0130 21:46:32.953109 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e0c921-ba72-45cf-b9b3-1b28148761d4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.953131 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e0c921-ba72-45cf-b9b3-1b28148761d4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.953435 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e0c921-ba72-45cf-b9b3-1b28148761d4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.954367 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.956188 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.956659 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.957093 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.957185 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:46:32 crc kubenswrapper[4914]: I0130 21:46:32.965235 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4"] Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.084168 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.084487 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25h7d\" (UniqueName: \"kubernetes.io/projected/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-kube-api-access-25h7d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.084566 4914 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.186268 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.186355 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25h7d\" (UniqueName: \"kubernetes.io/projected/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-kube-api-access-25h7d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.186462 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.194568 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.202482 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.209864 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25h7d\" (UniqueName: \"kubernetes.io/projected/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-kube-api-access-25h7d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.288947 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.862504 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4"] Jan 30 21:46:33 crc kubenswrapper[4914]: I0130 21:46:33.872790 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" event={"ID":"9717f88f-15d4-4b1d-93dd-dc656e5c64f6","Type":"ContainerStarted","Data":"d970a99e899bf6d17b5650f1d73dfd043c2ff63724d60bc4777f2e65a458cc66"} Jan 30 21:46:34 crc kubenswrapper[4914]: I0130 21:46:34.818611 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:46:34 crc kubenswrapper[4914]: E0130 21:46:34.819214 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:46:34 crc kubenswrapper[4914]: I0130 21:46:34.885777 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" event={"ID":"9717f88f-15d4-4b1d-93dd-dc656e5c64f6","Type":"ContainerStarted","Data":"b5f8f3deb5369942d6d09d404eee975f0660b242bc157b4e22447e7bdc02a808"} Jan 30 21:46:34 crc kubenswrapper[4914]: I0130 21:46:34.918267 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" podStartSLOduration=2.52407401 podStartE2EDuration="2.918246211s" podCreationTimestamp="2026-01-30 21:46:32 +0000 UTC" firstStartedPulling="2026-01-30 
21:46:33.864232719 +0000 UTC m=+1927.302869480" lastFinishedPulling="2026-01-30 21:46:34.25840492 +0000 UTC m=+1927.697041681" observedRunningTime="2026-01-30 21:46:34.904182744 +0000 UTC m=+1928.342819535" watchObservedRunningTime="2026-01-30 21:46:34.918246211 +0000 UTC m=+1928.356882972" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.250346 4914 scope.go:117] "RemoveContainer" containerID="54f09901f3d592d392b9f60149550d4d8d9b2c954734d003718dedf616083feb" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.276792 4914 scope.go:117] "RemoveContainer" containerID="4181bf5d67c4491e0b98ab6194e9b73e98dbf6f8d7cb4d2fb6b29c00602ebee0" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.365193 4914 scope.go:117] "RemoveContainer" containerID="0de72de9e1c09ecfa7bb5ad9c1b252f5a872040b65cefa8c01285030bee47686" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.427952 4914 scope.go:117] "RemoveContainer" containerID="776fac43153567a3aa1030f424c12e091b83ad9827ad7d122a14748bcce990d5" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.495967 4914 scope.go:117] "RemoveContainer" containerID="c89079d181f700e285a12c4da14f91a2f92d5183f8b8a9118aa4c3e9477fc6b4" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.550914 4914 scope.go:117] "RemoveContainer" containerID="f92dcde47eb4f286d8ab920088365901dd6531f26bf0caa94de77959d2a73815" Jan 30 21:46:43 crc kubenswrapper[4914]: I0130 21:46:43.604341 4914 scope.go:117] "RemoveContainer" containerID="8ce7ef54868ef9255cfb283abe8c8c4e449f44ab869cda4d58d7586213af5475" Jan 30 21:46:45 crc kubenswrapper[4914]: I0130 21:46:45.819399 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:46:45 crc kubenswrapper[4914]: E0130 21:46:45.820324 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:46:46 crc kubenswrapper[4914]: I0130 21:46:46.062427 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jwsg7"] Jan 30 21:46:46 crc kubenswrapper[4914]: I0130 21:46:46.071565 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jwsg7"] Jan 30 21:46:47 crc kubenswrapper[4914]: I0130 21:46:47.033592 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rq48f"] Jan 30 21:46:47 crc kubenswrapper[4914]: I0130 21:46:47.044540 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rq48f"] Jan 30 21:46:47 crc kubenswrapper[4914]: I0130 21:46:47.839026 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f1479b-3e35-4ffb-81ca-4bc42fb0d36b" path="/var/lib/kubelet/pods/53f1479b-3e35-4ffb-81ca-4bc42fb0d36b/volumes" Jan 30 21:46:47 crc kubenswrapper[4914]: I0130 21:46:47.840493 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0fec57-7f38-455f-85d0-47b90e552b48" path="/var/lib/kubelet/pods/aa0fec57-7f38-455f-85d0-47b90e552b48/volumes" Jan 30 21:46:58 crc kubenswrapper[4914]: I0130 21:46:58.818625 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:46:58 crc kubenswrapper[4914]: E0130 21:46:58.831042 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:47:11 crc kubenswrapper[4914]: I0130 21:47:11.819505 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:47:11 crc kubenswrapper[4914]: E0130 21:47:11.820661 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:47:22 crc kubenswrapper[4914]: I0130 21:47:22.423106 4914 generic.go:334] "Generic (PLEG): container finished" podID="9717f88f-15d4-4b1d-93dd-dc656e5c64f6" containerID="b5f8f3deb5369942d6d09d404eee975f0660b242bc157b4e22447e7bdc02a808" exitCode=0 Jan 30 21:47:22 crc kubenswrapper[4914]: I0130 21:47:22.423208 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" event={"ID":"9717f88f-15d4-4b1d-93dd-dc656e5c64f6","Type":"ContainerDied","Data":"b5f8f3deb5369942d6d09d404eee975f0660b242bc157b4e22447e7bdc02a808"} Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.011746 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.152281 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25h7d\" (UniqueName: \"kubernetes.io/projected/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-kube-api-access-25h7d\") pod \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.152371 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-ssh-key-openstack-edpm-ipam\") pod \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.152416 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-inventory\") pod \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\" (UID: \"9717f88f-15d4-4b1d-93dd-dc656e5c64f6\") " Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.172472 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-kube-api-access-25h7d" (OuterVolumeSpecName: "kube-api-access-25h7d") pod "9717f88f-15d4-4b1d-93dd-dc656e5c64f6" (UID: "9717f88f-15d4-4b1d-93dd-dc656e5c64f6"). InnerVolumeSpecName "kube-api-access-25h7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.181550 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-inventory" (OuterVolumeSpecName: "inventory") pod "9717f88f-15d4-4b1d-93dd-dc656e5c64f6" (UID: "9717f88f-15d4-4b1d-93dd-dc656e5c64f6"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.190371 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9717f88f-15d4-4b1d-93dd-dc656e5c64f6" (UID: "9717f88f-15d4-4b1d-93dd-dc656e5c64f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.255362 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.255400 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.255415 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25h7d\" (UniqueName: \"kubernetes.io/projected/9717f88f-15d4-4b1d-93dd-dc656e5c64f6-kube-api-access-25h7d\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.447159 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" event={"ID":"9717f88f-15d4-4b1d-93dd-dc656e5c64f6","Type":"ContainerDied","Data":"d970a99e899bf6d17b5650f1d73dfd043c2ff63724d60bc4777f2e65a458cc66"} Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.447199 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d970a99e899bf6d17b5650f1d73dfd043c2ff63724d60bc4777f2e65a458cc66" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 
21:47:24.447279 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.533544 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrk98"] Jan 30 21:47:24 crc kubenswrapper[4914]: E0130 21:47:24.533984 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9717f88f-15d4-4b1d-93dd-dc656e5c64f6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.534004 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9717f88f-15d4-4b1d-93dd-dc656e5c64f6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.534209 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9717f88f-15d4-4b1d-93dd-dc656e5c64f6" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.534930 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.536972 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.537218 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.538659 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.538863 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.546136 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrk98"] Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.663817 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.663911 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.663941 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-255km\" (UniqueName: \"kubernetes.io/projected/289a7e75-7ac0-4b40-9080-82dd58f5c81d-kube-api-access-255km\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.765534 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.765577 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255km\" (UniqueName: \"kubernetes.io/projected/289a7e75-7ac0-4b40-9080-82dd58f5c81d-kube-api-access-255km\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.765742 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.769492 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.769757 4914 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.787656 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255km\" (UniqueName: \"kubernetes.io/projected/289a7e75-7ac0-4b40-9080-82dd58f5c81d-kube-api-access-255km\") pod \"ssh-known-hosts-edpm-deployment-qrk98\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.818029 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:47:24 crc kubenswrapper[4914]: E0130 21:47:24.818333 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:47:24 crc kubenswrapper[4914]: I0130 21:47:24.853179 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:25 crc kubenswrapper[4914]: W0130 21:47:25.449134 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod289a7e75_7ac0_4b40_9080_82dd58f5c81d.slice/crio-6572bab468d38d758c86c4cd8f850bc6134bc58d1908613daa2e49a089941616 WatchSource:0}: Error finding container 6572bab468d38d758c86c4cd8f850bc6134bc58d1908613daa2e49a089941616: Status 404 returned error can't find the container with id 6572bab468d38d758c86c4cd8f850bc6134bc58d1908613daa2e49a089941616 Jan 30 21:47:25 crc kubenswrapper[4914]: I0130 21:47:25.458674 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qrk98"] Jan 30 21:47:26 crc kubenswrapper[4914]: I0130 21:47:26.474374 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" event={"ID":"289a7e75-7ac0-4b40-9080-82dd58f5c81d","Type":"ContainerStarted","Data":"6572bab468d38d758c86c4cd8f850bc6134bc58d1908613daa2e49a089941616"} Jan 30 21:47:28 crc kubenswrapper[4914]: I0130 21:47:28.494230 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" event={"ID":"289a7e75-7ac0-4b40-9080-82dd58f5c81d","Type":"ContainerStarted","Data":"35b2c18ce27385af9f907f6ba63f0da3f7fcaa521b641c1f12ce1e12133be380"} Jan 30 21:47:28 crc kubenswrapper[4914]: I0130 21:47:28.510063 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" podStartSLOduration=2.713494185 podStartE2EDuration="4.51003938s" podCreationTimestamp="2026-01-30 21:47:24 +0000 UTC" firstStartedPulling="2026-01-30 21:47:25.454498713 +0000 UTC m=+1978.893135524" lastFinishedPulling="2026-01-30 21:47:27.251043938 +0000 UTC m=+1980.689680719" observedRunningTime="2026-01-30 21:47:28.509088057 +0000 UTC m=+1981.947724838" 
watchObservedRunningTime="2026-01-30 21:47:28.51003938 +0000 UTC m=+1981.948676151" Jan 30 21:47:33 crc kubenswrapper[4914]: I0130 21:47:33.049803 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rtrh8"] Jan 30 21:47:33 crc kubenswrapper[4914]: I0130 21:47:33.058538 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rtrh8"] Jan 30 21:47:33 crc kubenswrapper[4914]: I0130 21:47:33.832026 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62190399-de6a-48f4-ba20-2458736b5b37" path="/var/lib/kubelet/pods/62190399-de6a-48f4-ba20-2458736b5b37/volumes" Jan 30 21:47:35 crc kubenswrapper[4914]: I0130 21:47:35.563518 4914 generic.go:334] "Generic (PLEG): container finished" podID="289a7e75-7ac0-4b40-9080-82dd58f5c81d" containerID="35b2c18ce27385af9f907f6ba63f0da3f7fcaa521b641c1f12ce1e12133be380" exitCode=0 Jan 30 21:47:35 crc kubenswrapper[4914]: I0130 21:47:35.563617 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" event={"ID":"289a7e75-7ac0-4b40-9080-82dd58f5c81d","Type":"ContainerDied","Data":"35b2c18ce27385af9f907f6ba63f0da3f7fcaa521b641c1f12ce1e12133be380"} Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.034578 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.126479 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-255km\" (UniqueName: \"kubernetes.io/projected/289a7e75-7ac0-4b40-9080-82dd58f5c81d-kube-api-access-255km\") pod \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.126628 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-inventory-0\") pod \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.126658 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-ssh-key-openstack-edpm-ipam\") pod \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\" (UID: \"289a7e75-7ac0-4b40-9080-82dd58f5c81d\") " Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.132061 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289a7e75-7ac0-4b40-9080-82dd58f5c81d-kube-api-access-255km" (OuterVolumeSpecName: "kube-api-access-255km") pod "289a7e75-7ac0-4b40-9080-82dd58f5c81d" (UID: "289a7e75-7ac0-4b40-9080-82dd58f5c81d"). InnerVolumeSpecName "kube-api-access-255km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.153681 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "289a7e75-7ac0-4b40-9080-82dd58f5c81d" (UID: "289a7e75-7ac0-4b40-9080-82dd58f5c81d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.162016 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "289a7e75-7ac0-4b40-9080-82dd58f5c81d" (UID: "289a7e75-7ac0-4b40-9080-82dd58f5c81d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.229831 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-255km\" (UniqueName: \"kubernetes.io/projected/289a7e75-7ac0-4b40-9080-82dd58f5c81d-kube-api-access-255km\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.229877 4914 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.229894 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/289a7e75-7ac0-4b40-9080-82dd58f5c81d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.588895 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" 
event={"ID":"289a7e75-7ac0-4b40-9080-82dd58f5c81d","Type":"ContainerDied","Data":"6572bab468d38d758c86c4cd8f850bc6134bc58d1908613daa2e49a089941616"} Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.588954 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6572bab468d38d758c86c4cd8f850bc6134bc58d1908613daa2e49a089941616" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.589050 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qrk98" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.678799 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5"] Jan 30 21:47:37 crc kubenswrapper[4914]: E0130 21:47:37.679302 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289a7e75-7ac0-4b40-9080-82dd58f5c81d" containerName="ssh-known-hosts-edpm-deployment" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.679321 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="289a7e75-7ac0-4b40-9080-82dd58f5c81d" containerName="ssh-known-hosts-edpm-deployment" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.679558 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="289a7e75-7ac0-4b40-9080-82dd58f5c81d" containerName="ssh-known-hosts-edpm-deployment" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.680325 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.684385 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.684421 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.684798 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.684800 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.696424 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5"] Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.742004 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pqn\" (UniqueName: \"kubernetes.io/projected/f8711449-6cb3-4b5e-9b13-618cb27f35dc-kube-api-access-n2pqn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.742122 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.742234 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.825443 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.843927 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pqn\" (UniqueName: \"kubernetes.io/projected/f8711449-6cb3-4b5e-9b13-618cb27f35dc-kube-api-access-n2pqn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.844035 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.844123 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.848654 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.853204 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:37 crc kubenswrapper[4914]: I0130 21:47:37.860363 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pqn\" (UniqueName: \"kubernetes.io/projected/f8711449-6cb3-4b5e-9b13-618cb27f35dc-kube-api-access-n2pqn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-z2rh5\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:38 crc kubenswrapper[4914]: I0130 21:47:38.051808 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:38 crc kubenswrapper[4914]: I0130 21:47:38.617016 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5"] Jan 30 21:47:38 crc kubenswrapper[4914]: I0130 21:47:38.620122 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"2beeffdd2e3a30f174e411bd48f6951bdc1c5b950b8351ad0c9f10106fc74a69"} Jan 30 21:47:39 crc kubenswrapper[4914]: I0130 21:47:39.647881 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" event={"ID":"f8711449-6cb3-4b5e-9b13-618cb27f35dc","Type":"ContainerStarted","Data":"7808683fb29808f4a212366a73347ccc3eb9399202ff41ad82a9d5b5a06aec73"} Jan 30 21:47:40 crc kubenswrapper[4914]: I0130 21:47:40.658801 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" event={"ID":"f8711449-6cb3-4b5e-9b13-618cb27f35dc","Type":"ContainerStarted","Data":"b84c077e8f618d802b5df30592ca46056cd9d7887eefeb3479e7e07e6f0d01e3"} Jan 30 21:47:43 crc kubenswrapper[4914]: I0130 21:47:43.756779 4914 scope.go:117] "RemoveContainer" containerID="900fcfdcc8d7e2ca1fb8bbb34b36214eafb90607ae9bb890b1d4505d60d8bf3d" Jan 30 21:47:43 crc kubenswrapper[4914]: I0130 21:47:43.798795 4914 scope.go:117] "RemoveContainer" containerID="04f6826f8e64a5a3980bb9d298190a1a13ef171bbe8a077a3f54e49b7502d7c0" Jan 30 21:47:43 crc kubenswrapper[4914]: I0130 21:47:43.861631 4914 scope.go:117] "RemoveContainer" containerID="3d9580134d260a1fdfdbba2e135073cb383e926e2e20e883e77c8ea6e25dc6c8" Jan 30 21:47:48 crc kubenswrapper[4914]: I0130 21:47:48.733926 4914 generic.go:334] "Generic (PLEG): container finished" podID="f8711449-6cb3-4b5e-9b13-618cb27f35dc" 
containerID="b84c077e8f618d802b5df30592ca46056cd9d7887eefeb3479e7e07e6f0d01e3" exitCode=0 Jan 30 21:47:48 crc kubenswrapper[4914]: I0130 21:47:48.734015 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" event={"ID":"f8711449-6cb3-4b5e-9b13-618cb27f35dc","Type":"ContainerDied","Data":"b84c077e8f618d802b5df30592ca46056cd9d7887eefeb3479e7e07e6f0d01e3"} Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.260211 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.429015 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2pqn\" (UniqueName: \"kubernetes.io/projected/f8711449-6cb3-4b5e-9b13-618cb27f35dc-kube-api-access-n2pqn\") pod \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.429373 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-inventory\") pod \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.429454 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-ssh-key-openstack-edpm-ipam\") pod \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\" (UID: \"f8711449-6cb3-4b5e-9b13-618cb27f35dc\") " Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.434959 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8711449-6cb3-4b5e-9b13-618cb27f35dc-kube-api-access-n2pqn" (OuterVolumeSpecName: "kube-api-access-n2pqn") pod 
"f8711449-6cb3-4b5e-9b13-618cb27f35dc" (UID: "f8711449-6cb3-4b5e-9b13-618cb27f35dc"). InnerVolumeSpecName "kube-api-access-n2pqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.459917 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-inventory" (OuterVolumeSpecName: "inventory") pod "f8711449-6cb3-4b5e-9b13-618cb27f35dc" (UID: "f8711449-6cb3-4b5e-9b13-618cb27f35dc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.478055 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f8711449-6cb3-4b5e-9b13-618cb27f35dc" (UID: "f8711449-6cb3-4b5e-9b13-618cb27f35dc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.532104 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.532132 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8711449-6cb3-4b5e-9b13-618cb27f35dc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.532142 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2pqn\" (UniqueName: \"kubernetes.io/projected/f8711449-6cb3-4b5e-9b13-618cb27f35dc-kube-api-access-n2pqn\") on node \"crc\" DevicePath \"\"" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.755176 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" event={"ID":"f8711449-6cb3-4b5e-9b13-618cb27f35dc","Type":"ContainerDied","Data":"7808683fb29808f4a212366a73347ccc3eb9399202ff41ad82a9d5b5a06aec73"} Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.755221 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7808683fb29808f4a212366a73347ccc3eb9399202ff41ad82a9d5b5a06aec73" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.755270 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-z2rh5" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.844913 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq"] Jan 30 21:47:50 crc kubenswrapper[4914]: E0130 21:47:50.845463 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8711449-6cb3-4b5e-9b13-618cb27f35dc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.845487 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8711449-6cb3-4b5e-9b13-618cb27f35dc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.845857 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8711449-6cb3-4b5e-9b13-618cb27f35dc" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.846795 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.849528 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.849658 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.849928 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.851076 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.853883 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq"] Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.941330 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 21:47:50.941511 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72tg\" (UniqueName: \"kubernetes.io/projected/36499faa-28d4-4710-9190-125a3f1561a8-kube-api-access-x72tg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:50 crc kubenswrapper[4914]: I0130 
21:47:50.941598 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.043847 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.044233 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x72tg\" (UniqueName: \"kubernetes.io/projected/36499faa-28d4-4710-9190-125a3f1561a8-kube-api-access-x72tg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.044283 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.047516 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.057150 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.062390 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x72tg\" (UniqueName: \"kubernetes.io/projected/36499faa-28d4-4710-9190-125a3f1561a8-kube-api-access-x72tg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.173215 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:47:51 crc kubenswrapper[4914]: I0130 21:47:51.788055 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq"] Jan 30 21:47:52 crc kubenswrapper[4914]: I0130 21:47:52.793960 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" event={"ID":"36499faa-28d4-4710-9190-125a3f1561a8","Type":"ContainerStarted","Data":"486b08c8e62a562954f8b9bb122ac29c3595325b676f854cbf04311f6dfc2d78"} Jan 30 21:47:53 crc kubenswrapper[4914]: I0130 21:47:53.803732 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" event={"ID":"36499faa-28d4-4710-9190-125a3f1561a8","Type":"ContainerStarted","Data":"2ac239172a1fe2c8872d5a0a6f97e1484586634af8eb75e3b6e61ad75e6d073f"} Jan 30 21:47:54 crc kubenswrapper[4914]: I0130 21:47:54.841728 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" podStartSLOduration=3.535545539 podStartE2EDuration="4.841697396s" podCreationTimestamp="2026-01-30 21:47:50 +0000 UTC" firstStartedPulling="2026-01-30 21:47:51.781977298 +0000 UTC m=+2005.220614069" lastFinishedPulling="2026-01-30 21:47:53.088129125 +0000 UTC m=+2006.526765926" observedRunningTime="2026-01-30 21:47:54.837329197 +0000 UTC m=+2008.275965958" watchObservedRunningTime="2026-01-30 21:47:54.841697396 +0000 UTC m=+2008.280334147" Jan 30 21:48:02 crc kubenswrapper[4914]: I0130 21:48:02.909131 4914 generic.go:334] "Generic (PLEG): container finished" podID="36499faa-28d4-4710-9190-125a3f1561a8" containerID="2ac239172a1fe2c8872d5a0a6f97e1484586634af8eb75e3b6e61ad75e6d073f" exitCode=0 Jan 30 21:48:02 crc kubenswrapper[4914]: I0130 21:48:02.909286 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" event={"ID":"36499faa-28d4-4710-9190-125a3f1561a8","Type":"ContainerDied","Data":"2ac239172a1fe2c8872d5a0a6f97e1484586634af8eb75e3b6e61ad75e6d073f"} Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.425660 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.487548 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-ssh-key-openstack-edpm-ipam\") pod \"36499faa-28d4-4710-9190-125a3f1561a8\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.487597 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x72tg\" (UniqueName: \"kubernetes.io/projected/36499faa-28d4-4710-9190-125a3f1561a8-kube-api-access-x72tg\") pod \"36499faa-28d4-4710-9190-125a3f1561a8\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.487689 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-inventory\") pod \"36499faa-28d4-4710-9190-125a3f1561a8\" (UID: \"36499faa-28d4-4710-9190-125a3f1561a8\") " Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.494143 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36499faa-28d4-4710-9190-125a3f1561a8-kube-api-access-x72tg" (OuterVolumeSpecName: "kube-api-access-x72tg") pod "36499faa-28d4-4710-9190-125a3f1561a8" (UID: "36499faa-28d4-4710-9190-125a3f1561a8"). InnerVolumeSpecName "kube-api-access-x72tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.517986 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-inventory" (OuterVolumeSpecName: "inventory") pod "36499faa-28d4-4710-9190-125a3f1561a8" (UID: "36499faa-28d4-4710-9190-125a3f1561a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.519485 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36499faa-28d4-4710-9190-125a3f1561a8" (UID: "36499faa-28d4-4710-9190-125a3f1561a8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.590565 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.590607 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x72tg\" (UniqueName: \"kubernetes.io/projected/36499faa-28d4-4710-9190-125a3f1561a8-kube-api-access-x72tg\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.590621 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36499faa-28d4-4710-9190-125a3f1561a8-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.928675 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" 
event={"ID":"36499faa-28d4-4710-9190-125a3f1561a8","Type":"ContainerDied","Data":"486b08c8e62a562954f8b9bb122ac29c3595325b676f854cbf04311f6dfc2d78"} Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.928759 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486b08c8e62a562954f8b9bb122ac29c3595325b676f854cbf04311f6dfc2d78" Jan 30 21:48:04 crc kubenswrapper[4914]: I0130 21:48:04.928768 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.018773 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj"] Jan 30 21:48:05 crc kubenswrapper[4914]: E0130 21:48:05.019184 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36499faa-28d4-4710-9190-125a3f1561a8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.019201 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="36499faa-28d4-4710-9190-125a3f1561a8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.019392 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="36499faa-28d4-4710-9190-125a3f1561a8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.020304 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.023445 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.023955 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.024080 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.024236 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.024293 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.024236 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.024966 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.025043 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.060907 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj"] Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.204665 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.205507 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.205556 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.205579 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.205838 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.205935 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206116 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206156 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206302 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206384 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206418 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206450 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206501 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.206619 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk5dj\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-kube-api-access-fk5dj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308376 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308499 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308631 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308724 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308845 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308896 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.308963 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: 
\"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309063 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk5dj\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-kube-api-access-fk5dj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309121 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309232 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309308 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309373 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309563 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.309664 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.313030 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.313190 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.313217 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.313923 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.315462 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc 
kubenswrapper[4914]: I0130 21:48:05.318284 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.322684 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.322993 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.323850 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.323880 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.324265 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.324520 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.331369 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.333363 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk5dj\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-kube-api-access-fk5dj\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.341522 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.889108 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj"] Jan 30 21:48:05 crc kubenswrapper[4914]: I0130 21:48:05.948914 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" event={"ID":"d218c024-e43c-4d70-8e68-4d03d423d9ae","Type":"ContainerStarted","Data":"c4158b19942bd4889c2f12a6d3f199f4bd94ed1f11d49a8d454cf26b1d5c57ef"} Jan 30 21:48:08 crc kubenswrapper[4914]: I0130 21:48:08.979839 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" event={"ID":"d218c024-e43c-4d70-8e68-4d03d423d9ae","Type":"ContainerStarted","Data":"bd5da2090dd83e354af66cb6d1d5368385361b4696284049809ebdd665704ad4"} Jan 30 21:48:09 crc kubenswrapper[4914]: I0130 21:48:09.008838 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" podStartSLOduration=3.265587831 podStartE2EDuration="5.008815163s" podCreationTimestamp="2026-01-30 21:48:04 +0000 UTC" firstStartedPulling="2026-01-30 21:48:05.896715971 +0000 UTC m=+2019.335352732" lastFinishedPulling="2026-01-30 21:48:07.639943303 +0000 UTC m=+2021.078580064" observedRunningTime="2026-01-30 21:48:08.999678954 +0000 UTC m=+2022.438315735" watchObservedRunningTime="2026-01-30 21:48:09.008815163 +0000 UTC m=+2022.447451934" Jan 30 21:48:17 crc kubenswrapper[4914]: I0130 21:48:17.041580 4914 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-f9bxt"] Jan 30 21:48:17 crc kubenswrapper[4914]: I0130 21:48:17.049231 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-f9bxt"] Jan 30 21:48:17 crc kubenswrapper[4914]: I0130 21:48:17.830554 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce3d73e-f519-423b-81c0-d120a416b488" path="/var/lib/kubelet/pods/8ce3d73e-f519-423b-81c0-d120a416b488/volumes" Jan 30 21:48:26 crc kubenswrapper[4914]: I0130 21:48:26.034082 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-klrhc"] Jan 30 21:48:26 crc kubenswrapper[4914]: I0130 21:48:26.044990 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-klrhc"] Jan 30 21:48:27 crc kubenswrapper[4914]: I0130 21:48:27.833616 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097daca3-adbd-4d0a-8606-876822a0cd4a" path="/var/lib/kubelet/pods/097daca3-adbd-4d0a-8606-876822a0cd4a/volumes" Jan 30 21:48:43 crc kubenswrapper[4914]: I0130 21:48:43.975276 4914 scope.go:117] "RemoveContainer" containerID="863abe8d1c7f1c569da5a45a039691412553cb8452748bb645bba991acab0053" Jan 30 21:48:44 crc kubenswrapper[4914]: I0130 21:48:44.012792 4914 scope.go:117] "RemoveContainer" containerID="19aa1cbe155106ce92b6c981189943363d426f118a6814e2ffba0bc2f62e79a7" Jan 30 21:48:44 crc kubenswrapper[4914]: I0130 21:48:44.349867 4914 generic.go:334] "Generic (PLEG): container finished" podID="d218c024-e43c-4d70-8e68-4d03d423d9ae" containerID="bd5da2090dd83e354af66cb6d1d5368385361b4696284049809ebdd665704ad4" exitCode=0 Jan 30 21:48:44 crc kubenswrapper[4914]: I0130 21:48:44.349916 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" 
event={"ID":"d218c024-e43c-4d70-8e68-4d03d423d9ae","Type":"ContainerDied","Data":"bd5da2090dd83e354af66cb6d1d5368385361b4696284049809ebdd665704ad4"} Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.865496 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958226 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-telemetry-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958269 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-bootstrap-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958311 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ovn-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958466 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958507 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-nova-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958533 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-libvirt-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958564 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-neutron-metadata-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958588 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958605 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-repo-setup-combined-ca-bundle\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958633 4914 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958652 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk5dj\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-kube-api-access-fk5dj\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958675 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-inventory\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958695 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ssh-key-openstack-edpm-ipam\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.958782 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d218c024-e43c-4d70-8e68-4d03d423d9ae\" (UID: \"d218c024-e43c-4d70-8e68-4d03d423d9ae\") " Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.982865 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.983070 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-kube-api-access-fk5dj" (OuterVolumeSpecName: "kube-api-access-fk5dj") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "kube-api-access-fk5dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.983914 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.984268 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.991219 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.991242 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.991348 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.991363 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.998921 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:45 crc kubenswrapper[4914]: I0130 21:48:45.999093 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.020907 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.031808 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064279 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064479 4914 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064586 4914 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064668 4914 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064777 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064888 4914 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.064976 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.065058 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk5dj\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-kube-api-access-fk5dj\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.065141 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d218c024-e43c-4d70-8e68-4d03d423d9ae-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.065226 4914 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.065298 4914 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: 
I0130 21:48:46.065368 4914 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.087844 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.093564 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-inventory" (OuterVolumeSpecName: "inventory") pod "d218c024-e43c-4d70-8e68-4d03d423d9ae" (UID: "d218c024-e43c-4d70-8e68-4d03d423d9ae"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.166884 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.167089 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d218c024-e43c-4d70-8e68-4d03d423d9ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.366838 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" event={"ID":"d218c024-e43c-4d70-8e68-4d03d423d9ae","Type":"ContainerDied","Data":"c4158b19942bd4889c2f12a6d3f199f4bd94ed1f11d49a8d454cf26b1d5c57ef"} Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.367186 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4158b19942bd4889c2f12a6d3f199f4bd94ed1f11d49a8d454cf26b1d5c57ef" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.366908 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.488859 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89"] Jan 30 21:48:46 crc kubenswrapper[4914]: E0130 21:48:46.489254 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d218c024-e43c-4d70-8e68-4d03d423d9ae" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.489271 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d218c024-e43c-4d70-8e68-4d03d423d9ae" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.489456 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d218c024-e43c-4d70-8e68-4d03d423d9ae" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.490132 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.492743 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.493139 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.493416 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.493677 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.494077 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.505224 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89"] Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.576493 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.576576 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.576601 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.576620 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gr5w\" (UniqueName: \"kubernetes.io/projected/1d553e8c-3252-4e0b-87f6-8e649c83f3de-kube-api-access-5gr5w\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.576672 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.678817 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.678971 4914 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.679039 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.679071 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.679099 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gr5w\" (UniqueName: \"kubernetes.io/projected/1d553e8c-3252-4e0b-87f6-8e649c83f3de-kube-api-access-5gr5w\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.680176 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.683129 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.683438 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.684354 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.696961 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gr5w\" (UniqueName: \"kubernetes.io/projected/1d553e8c-3252-4e0b-87f6-8e649c83f3de-kube-api-access-5gr5w\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7nn89\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:46 crc kubenswrapper[4914]: I0130 21:48:46.851185 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:48:47 crc kubenswrapper[4914]: I0130 21:48:47.580456 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89"] Jan 30 21:48:48 crc kubenswrapper[4914]: I0130 21:48:48.384012 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" event={"ID":"1d553e8c-3252-4e0b-87f6-8e649c83f3de","Type":"ContainerStarted","Data":"2b8629a3826f613db4efd696897004a5ebe65e5fd1bb006fb9c718aa7293957d"} Jan 30 21:48:48 crc kubenswrapper[4914]: I0130 21:48:48.384384 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" event={"ID":"1d553e8c-3252-4e0b-87f6-8e649c83f3de","Type":"ContainerStarted","Data":"3bb19d08e6291df71e6589a6c0a354864157cec4f7c805da31aa74ca152eaf49"} Jan 30 21:48:48 crc kubenswrapper[4914]: I0130 21:48:48.404200 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" podStartSLOduration=1.897726979 podStartE2EDuration="2.404182692s" podCreationTimestamp="2026-01-30 21:48:46 +0000 UTC" firstStartedPulling="2026-01-30 21:48:47.585402836 +0000 UTC m=+2061.024039597" lastFinishedPulling="2026-01-30 21:48:48.091858539 +0000 UTC m=+2061.530495310" observedRunningTime="2026-01-30 21:48:48.397474455 +0000 UTC m=+2061.836111206" watchObservedRunningTime="2026-01-30 21:48:48.404182692 +0000 UTC m=+2061.842819453" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.161491 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9c7br"] Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.164133 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.176893 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9c7br"] Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.267245 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-catalog-content\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.267354 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-utilities\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.267399 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g444b\" (UniqueName: \"kubernetes.io/projected/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-kube-api-access-g444b\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.369410 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-catalog-content\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.369501 4914 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-utilities\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.369542 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g444b\" (UniqueName: \"kubernetes.io/projected/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-kube-api-access-g444b\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.370102 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-utilities\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.370118 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-catalog-content\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.398471 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g444b\" (UniqueName: \"kubernetes.io/projected/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-kube-api-access-g444b\") pod \"redhat-operators-9c7br\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:45 crc kubenswrapper[4914]: I0130 21:49:45.536802 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:49:46 crc kubenswrapper[4914]: I0130 21:49:46.152789 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9c7br"] Jan 30 21:49:46 crc kubenswrapper[4914]: I0130 21:49:46.974930 4914 generic.go:334] "Generic (PLEG): container finished" podID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerID="7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2" exitCode=0 Jan 30 21:49:46 crc kubenswrapper[4914]: I0130 21:49:46.975101 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerDied","Data":"7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2"} Jan 30 21:49:46 crc kubenswrapper[4914]: I0130 21:49:46.975232 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerStarted","Data":"c4f6e51f39b95b9a9a177a4a9d9b10f0aa6d8b552df90d1319d4a78cff8912b6"} Jan 30 21:49:49 crc kubenswrapper[4914]: I0130 21:49:49.005492 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerStarted","Data":"77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61"} Jan 30 21:49:52 crc kubenswrapper[4914]: I0130 21:49:52.041320 4914 generic.go:334] "Generic (PLEG): container finished" podID="1d553e8c-3252-4e0b-87f6-8e649c83f3de" containerID="2b8629a3826f613db4efd696897004a5ebe65e5fd1bb006fb9c718aa7293957d" exitCode=0 Jan 30 21:49:52 crc kubenswrapper[4914]: I0130 21:49:52.041409 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" 
event={"ID":"1d553e8c-3252-4e0b-87f6-8e649c83f3de","Type":"ContainerDied","Data":"2b8629a3826f613db4efd696897004a5ebe65e5fd1bb006fb9c718aa7293957d"} Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.630729 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.693504 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gr5w\" (UniqueName: \"kubernetes.io/projected/1d553e8c-3252-4e0b-87f6-8e649c83f3de-kube-api-access-5gr5w\") pod \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.693606 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-inventory\") pod \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.693662 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ssh-key-openstack-edpm-ipam\") pod \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.693680 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovn-combined-ca-bundle\") pod \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.693809 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovncontroller-config-0\") pod \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\" (UID: \"1d553e8c-3252-4e0b-87f6-8e649c83f3de\") " Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.699683 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1d553e8c-3252-4e0b-87f6-8e649c83f3de" (UID: "1d553e8c-3252-4e0b-87f6-8e649c83f3de"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.715280 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d553e8c-3252-4e0b-87f6-8e649c83f3de-kube-api-access-5gr5w" (OuterVolumeSpecName: "kube-api-access-5gr5w") pod "1d553e8c-3252-4e0b-87f6-8e649c83f3de" (UID: "1d553e8c-3252-4e0b-87f6-8e649c83f3de"). InnerVolumeSpecName "kube-api-access-5gr5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.729000 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-inventory" (OuterVolumeSpecName: "inventory") pod "1d553e8c-3252-4e0b-87f6-8e649c83f3de" (UID: "1d553e8c-3252-4e0b-87f6-8e649c83f3de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.744208 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1d553e8c-3252-4e0b-87f6-8e649c83f3de" (UID: "1d553e8c-3252-4e0b-87f6-8e649c83f3de"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.747168 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1d553e8c-3252-4e0b-87f6-8e649c83f3de" (UID: "1d553e8c-3252-4e0b-87f6-8e649c83f3de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.796140 4914 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.796172 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gr5w\" (UniqueName: \"kubernetes.io/projected/1d553e8c-3252-4e0b-87f6-8e649c83f3de-kube-api-access-5gr5w\") on node \"crc\" DevicePath \"\"" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.796183 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.796194 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:49:53 crc kubenswrapper[4914]: I0130 21:49:53.796206 4914 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d553e8c-3252-4e0b-87f6-8e649c83f3de-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.060263 4914 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" event={"ID":"1d553e8c-3252-4e0b-87f6-8e649c83f3de","Type":"ContainerDied","Data":"3bb19d08e6291df71e6589a6c0a354864157cec4f7c805da31aa74ca152eaf49"} Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.060616 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bb19d08e6291df71e6589a6c0a354864157cec4f7c805da31aa74ca152eaf49" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.060307 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7nn89" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.169434 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz"] Jan 30 21:49:54 crc kubenswrapper[4914]: E0130 21:49:54.180300 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d553e8c-3252-4e0b-87f6-8e649c83f3de" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.180320 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d553e8c-3252-4e0b-87f6-8e649c83f3de" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.180527 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d553e8c-3252-4e0b-87f6-8e649c83f3de" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.181321 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.183387 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.183458 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.184183 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz"] Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.190764 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.190916 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.191106 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.190775 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.307163 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.307223 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.307445 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.307609 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.307640 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.307837 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4mrm9\" (UniqueName: \"kubernetes.io/projected/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-kube-api-access-4mrm9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.409783 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.409855 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.409984 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.410036 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.410060 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.410124 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrm9\" (UniqueName: \"kubernetes.io/projected/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-kube-api-access-4mrm9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.414857 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.415826 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.417292 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.418323 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.421941 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.429244 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mrm9\" (UniqueName: \"kubernetes.io/projected/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-kube-api-access-4mrm9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:54 crc kubenswrapper[4914]: I0130 21:49:54.497471 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:49:55 crc kubenswrapper[4914]: I0130 21:49:55.050431 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz"] Jan 30 21:49:55 crc kubenswrapper[4914]: W0130 21:49:55.054506 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f19f9c6_b274_40bd_9693_b26eb56bbe0a.slice/crio-49a7980ff403c4243a7895931b2110b121d340bd0141af825431d0de1890cbb4 WatchSource:0}: Error finding container 49a7980ff403c4243a7895931b2110b121d340bd0141af825431d0de1890cbb4: Status 404 returned error can't find the container with id 49a7980ff403c4243a7895931b2110b121d340bd0141af825431d0de1890cbb4 Jan 30 21:49:55 crc kubenswrapper[4914]: I0130 21:49:55.078777 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" event={"ID":"7f19f9c6-b274-40bd-9693-b26eb56bbe0a","Type":"ContainerStarted","Data":"49a7980ff403c4243a7895931b2110b121d340bd0141af825431d0de1890cbb4"} Jan 30 21:49:55 crc kubenswrapper[4914]: I0130 21:49:55.081656 4914 generic.go:334] "Generic (PLEG): container finished" podID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerID="77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61" exitCode=0 Jan 30 21:49:55 crc kubenswrapper[4914]: I0130 21:49:55.081697 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerDied","Data":"77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61"} Jan 30 21:49:56 crc kubenswrapper[4914]: I0130 
21:49:56.095094 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerStarted","Data":"1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef"} Jan 30 21:49:56 crc kubenswrapper[4914]: I0130 21:49:56.098073 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" event={"ID":"7f19f9c6-b274-40bd-9693-b26eb56bbe0a","Type":"ContainerStarted","Data":"9a3f6a49b0f9f84e28504a38be2164fa16db45d7dae5e85bba830441073f1911"} Jan 30 21:49:56 crc kubenswrapper[4914]: I0130 21:49:56.126753 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9c7br" podStartSLOduration=2.647905456 podStartE2EDuration="11.126725638s" podCreationTimestamp="2026-01-30 21:49:45 +0000 UTC" firstStartedPulling="2026-01-30 21:49:46.978169124 +0000 UTC m=+2120.416805885" lastFinishedPulling="2026-01-30 21:49:55.456989306 +0000 UTC m=+2128.895626067" observedRunningTime="2026-01-30 21:49:56.113872897 +0000 UTC m=+2129.552509658" watchObservedRunningTime="2026-01-30 21:49:56.126725638 +0000 UTC m=+2129.565362409" Jan 30 21:49:56 crc kubenswrapper[4914]: I0130 21:49:56.142103 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" podStartSLOduration=1.654922541 podStartE2EDuration="2.142083982s" podCreationTimestamp="2026-01-30 21:49:54 +0000 UTC" firstStartedPulling="2026-01-30 21:49:55.057579507 +0000 UTC m=+2128.496216268" lastFinishedPulling="2026-01-30 21:49:55.544740948 +0000 UTC m=+2128.983377709" observedRunningTime="2026-01-30 21:49:56.138987245 +0000 UTC m=+2129.577624006" watchObservedRunningTime="2026-01-30 21:49:56.142083982 +0000 UTC m=+2129.580720743" Jan 30 21:49:56 crc kubenswrapper[4914]: I0130 21:49:56.983540 4914 patch_prober.go:28] 
interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:49:56 crc kubenswrapper[4914]: I0130 21:49:56.983891 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:50:05 crc kubenswrapper[4914]: I0130 21:50:05.537337 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:50:05 crc kubenswrapper[4914]: I0130 21:50:05.537987 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:50:05 crc kubenswrapper[4914]: I0130 21:50:05.583871 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:50:06 crc kubenswrapper[4914]: I0130 21:50:06.279475 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:50:06 crc kubenswrapper[4914]: I0130 21:50:06.341053 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9c7br"] Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.236733 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9c7br" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="registry-server" containerID="cri-o://1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef" gracePeriod=2 Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 
21:50:08.798297 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.863077 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-utilities\") pod \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.863390 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-catalog-content\") pod \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.863548 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g444b\" (UniqueName: \"kubernetes.io/projected/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-kube-api-access-g444b\") pod \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\" (UID: \"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e\") " Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.864721 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-utilities" (OuterVolumeSpecName: "utilities") pod "d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" (UID: "d54a3e13-66e7-45d4-ad95-3b94e8f1e45e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.869556 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-kube-api-access-g444b" (OuterVolumeSpecName: "kube-api-access-g444b") pod "d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" (UID: "d54a3e13-66e7-45d4-ad95-3b94e8f1e45e"). InnerVolumeSpecName "kube-api-access-g444b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.966538 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g444b\" (UniqueName: \"kubernetes.io/projected/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-kube-api-access-g444b\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.966580 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:08 crc kubenswrapper[4914]: I0130 21:50:08.988659 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" (UID: "d54a3e13-66e7-45d4-ad95-3b94e8f1e45e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.069539 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.250313 4914 generic.go:334] "Generic (PLEG): container finished" podID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerID="1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef" exitCode=0 Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.250390 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerDied","Data":"1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef"} Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.250425 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9c7br" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.250481 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9c7br" event={"ID":"d54a3e13-66e7-45d4-ad95-3b94e8f1e45e","Type":"ContainerDied","Data":"c4f6e51f39b95b9a9a177a4a9d9b10f0aa6d8b552df90d1319d4a78cff8912b6"} Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.250515 4914 scope.go:117] "RemoveContainer" containerID="1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.274821 4914 scope.go:117] "RemoveContainer" containerID="77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.305843 4914 scope.go:117] "RemoveContainer" containerID="7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.317769 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9c7br"] Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.330487 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9c7br"] Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.349647 4914 scope.go:117] "RemoveContainer" containerID="1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef" Jan 30 21:50:09 crc kubenswrapper[4914]: E0130 21:50:09.350132 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef\": container with ID starting with 1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef not found: ID does not exist" containerID="1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.350185 4914 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef"} err="failed to get container status \"1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef\": rpc error: code = NotFound desc = could not find container \"1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef\": container with ID starting with 1da67dc9fe3dd991f84526b61b7878961c17a02e291357802a5c64f6870314ef not found: ID does not exist" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.350217 4914 scope.go:117] "RemoveContainer" containerID="77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61" Jan 30 21:50:09 crc kubenswrapper[4914]: E0130 21:50:09.350683 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61\": container with ID starting with 77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61 not found: ID does not exist" containerID="77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.350737 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61"} err="failed to get container status \"77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61\": rpc error: code = NotFound desc = could not find container \"77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61\": container with ID starting with 77087235eb5651fd7fb3c718c2145f363ba00f11a6991b09012090107f97df61 not found: ID does not exist" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.350759 4914 scope.go:117] "RemoveContainer" containerID="7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2" Jan 30 21:50:09 crc kubenswrapper[4914]: E0130 
21:50:09.351240 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2\": container with ID starting with 7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2 not found: ID does not exist" containerID="7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.351279 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2"} err="failed to get container status \"7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2\": rpc error: code = NotFound desc = could not find container \"7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2\": container with ID starting with 7a50b4d907fc8cfb932b52d542f5cc1eb467ed017fb5ab454936d501037e9fc2 not found: ID does not exist" Jan 30 21:50:09 crc kubenswrapper[4914]: I0130 21:50:09.832067 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" path="/var/lib/kubelet/pods/d54a3e13-66e7-45d4-ad95-3b94e8f1e45e/volumes" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.665630 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jnqjn"] Jan 30 21:50:13 crc kubenswrapper[4914]: E0130 21:50:13.666822 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="registry-server" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.666840 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="registry-server" Jan 30 21:50:13 crc kubenswrapper[4914]: E0130 21:50:13.666879 4914 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="extract-content" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.666887 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="extract-content" Jan 30 21:50:13 crc kubenswrapper[4914]: E0130 21:50:13.666903 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="extract-utilities" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.666913 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="extract-utilities" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.667159 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54a3e13-66e7-45d4-ad95-3b94e8f1e45e" containerName="registry-server" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.669176 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.718834 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jnqjn"] Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.783317 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxlk5\" (UniqueName: \"kubernetes.io/projected/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-kube-api-access-nxlk5\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.783440 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-catalog-content\") pod \"community-operators-jnqjn\" (UID: 
\"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.783512 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-utilities\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.885325 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-catalog-content\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.885605 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-utilities\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.885670 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxlk5\" (UniqueName: \"kubernetes.io/projected/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-kube-api-access-nxlk5\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.885814 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-catalog-content\") pod \"community-operators-jnqjn\" (UID: 
\"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.886035 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-utilities\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.911361 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxlk5\" (UniqueName: \"kubernetes.io/projected/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-kube-api-access-nxlk5\") pod \"community-operators-jnqjn\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:13 crc kubenswrapper[4914]: I0130 21:50:13.997258 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:14 crc kubenswrapper[4914]: I0130 21:50:14.521911 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jnqjn"] Jan 30 21:50:15 crc kubenswrapper[4914]: I0130 21:50:15.311888 4914 generic.go:334] "Generic (PLEG): container finished" podID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerID="861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a" exitCode=0 Jan 30 21:50:15 crc kubenswrapper[4914]: I0130 21:50:15.311967 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerDied","Data":"861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a"} Jan 30 21:50:15 crc kubenswrapper[4914]: I0130 21:50:15.312212 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" 
event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerStarted","Data":"faf8416b4dd595393b17f83e51e11d380709046619e721fa6b011a28ae7f7be7"} Jan 30 21:50:17 crc kubenswrapper[4914]: I0130 21:50:17.333823 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerStarted","Data":"222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8"} Jan 30 21:50:18 crc kubenswrapper[4914]: I0130 21:50:18.347245 4914 generic.go:334] "Generic (PLEG): container finished" podID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerID="222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8" exitCode=0 Jan 30 21:50:18 crc kubenswrapper[4914]: I0130 21:50:18.347355 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerDied","Data":"222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8"} Jan 30 21:50:19 crc kubenswrapper[4914]: I0130 21:50:19.358848 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerStarted","Data":"2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86"} Jan 30 21:50:19 crc kubenswrapper[4914]: I0130 21:50:19.382902 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jnqjn" podStartSLOduration=2.572699162 podStartE2EDuration="6.382884734s" podCreationTimestamp="2026-01-30 21:50:13 +0000 UTC" firstStartedPulling="2026-01-30 21:50:15.313467985 +0000 UTC m=+2148.752104746" lastFinishedPulling="2026-01-30 21:50:19.123653547 +0000 UTC m=+2152.562290318" observedRunningTime="2026-01-30 21:50:19.373879379 +0000 UTC m=+2152.812516160" watchObservedRunningTime="2026-01-30 21:50:19.382884734 +0000 UTC 
m=+2152.821521485" Jan 30 21:50:23 crc kubenswrapper[4914]: I0130 21:50:23.997352 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:23 crc kubenswrapper[4914]: I0130 21:50:23.997949 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:24 crc kubenswrapper[4914]: I0130 21:50:24.043009 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:24 crc kubenswrapper[4914]: I0130 21:50:24.463325 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:25 crc kubenswrapper[4914]: I0130 21:50:25.206780 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jnqjn"] Jan 30 21:50:26 crc kubenswrapper[4914]: I0130 21:50:26.420502 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jnqjn" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="registry-server" containerID="cri-o://2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86" gracePeriod=2 Jan 30 21:50:26 crc kubenswrapper[4914]: I0130 21:50:26.983418 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:50:26 crc kubenswrapper[4914]: I0130 21:50:26.983869 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.029361 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.165518 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-utilities\") pod \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.165878 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxlk5\" (UniqueName: \"kubernetes.io/projected/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-kube-api-access-nxlk5\") pod \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.166059 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-catalog-content\") pod \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\" (UID: \"fdd55fd4-f0d1-4cc4-9f83-3b309547f888\") " Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.166758 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-utilities" (OuterVolumeSpecName: "utilities") pod "fdd55fd4-f0d1-4cc4-9f83-3b309547f888" (UID: "fdd55fd4-f0d1-4cc4-9f83-3b309547f888"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.170451 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-kube-api-access-nxlk5" (OuterVolumeSpecName: "kube-api-access-nxlk5") pod "fdd55fd4-f0d1-4cc4-9f83-3b309547f888" (UID: "fdd55fd4-f0d1-4cc4-9f83-3b309547f888"). InnerVolumeSpecName "kube-api-access-nxlk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.269002 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.269041 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxlk5\" (UniqueName: \"kubernetes.io/projected/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-kube-api-access-nxlk5\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.368783 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdd55fd4-f0d1-4cc4-9f83-3b309547f888" (UID: "fdd55fd4-f0d1-4cc4-9f83-3b309547f888"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.371512 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd55fd4-f0d1-4cc4-9f83-3b309547f888-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.442885 4914 generic.go:334] "Generic (PLEG): container finished" podID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerID="2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86" exitCode=0 Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.442924 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerDied","Data":"2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86"} Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.442950 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jnqjn" event={"ID":"fdd55fd4-f0d1-4cc4-9f83-3b309547f888","Type":"ContainerDied","Data":"faf8416b4dd595393b17f83e51e11d380709046619e721fa6b011a28ae7f7be7"} Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.442962 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jnqjn" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.442968 4914 scope.go:117] "RemoveContainer" containerID="2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.463981 4914 scope.go:117] "RemoveContainer" containerID="222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.481094 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jnqjn"] Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.492066 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jnqjn"] Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.495439 4914 scope.go:117] "RemoveContainer" containerID="861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.544826 4914 scope.go:117] "RemoveContainer" containerID="2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86" Jan 30 21:50:27 crc kubenswrapper[4914]: E0130 21:50:27.545448 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86\": container with ID starting with 2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86 not found: ID does not exist" containerID="2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.545578 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86"} err="failed to get container status \"2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86\": rpc error: code = NotFound desc = could not find 
container \"2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86\": container with ID starting with 2c8ac09d252b819b2b1c15a98cc4fdf1dd8d9134023f24a1eb1ae4b5e2c27d86 not found: ID does not exist" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.545826 4914 scope.go:117] "RemoveContainer" containerID="222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8" Jan 30 21:50:27 crc kubenswrapper[4914]: E0130 21:50:27.546236 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8\": container with ID starting with 222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8 not found: ID does not exist" containerID="222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.546260 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8"} err="failed to get container status \"222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8\": rpc error: code = NotFound desc = could not find container \"222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8\": container with ID starting with 222aac5095b942281548d2e07b8a96464c69703fc17e0a7b6461f4e18f0b56c8 not found: ID does not exist" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.546275 4914 scope.go:117] "RemoveContainer" containerID="861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a" Jan 30 21:50:27 crc kubenswrapper[4914]: E0130 21:50:27.546553 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a\": container with ID starting with 861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a not found: ID does 
not exist" containerID="861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.546586 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a"} err="failed to get container status \"861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a\": rpc error: code = NotFound desc = could not find container \"861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a\": container with ID starting with 861cd3b1abcf47c47587ea9713afca390bf5ae45aa7b9a4176e29c613c41906a not found: ID does not exist" Jan 30 21:50:27 crc kubenswrapper[4914]: I0130 21:50:27.833156 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" path="/var/lib/kubelet/pods/fdd55fd4-f0d1-4cc4-9f83-3b309547f888/volumes" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.570270 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pgngj"] Jan 30 21:50:36 crc kubenswrapper[4914]: E0130 21:50:36.571337 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="extract-content" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.571354 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="extract-content" Jan 30 21:50:36 crc kubenswrapper[4914]: E0130 21:50:36.571375 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="registry-server" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.571385 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="registry-server" Jan 30 21:50:36 crc kubenswrapper[4914]: E0130 21:50:36.571416 4914 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="extract-utilities" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.571425 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="extract-utilities" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.571691 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd55fd4-f0d1-4cc4-9f83-3b309547f888" containerName="registry-server" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.574557 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.594343 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pgngj"] Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.663216 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-catalog-content\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.663354 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-utilities\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.663398 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fc9g\" (UniqueName: \"kubernetes.io/projected/34450472-8d4e-40db-a119-5cf53cdf72a8-kube-api-access-9fc9g\") pod 
\"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.765283 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-catalog-content\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.765860 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-catalog-content\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.766103 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-utilities\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.766153 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fc9g\" (UniqueName: \"kubernetes.io/projected/34450472-8d4e-40db-a119-5cf53cdf72a8-kube-api-access-9fc9g\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.766607 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-utilities\") pod \"certified-operators-pgngj\" (UID: 
\"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.798048 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fc9g\" (UniqueName: \"kubernetes.io/projected/34450472-8d4e-40db-a119-5cf53cdf72a8-kube-api-access-9fc9g\") pod \"certified-operators-pgngj\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:36 crc kubenswrapper[4914]: I0130 21:50:36.907318 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:37 crc kubenswrapper[4914]: I0130 21:50:37.444910 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pgngj"] Jan 30 21:50:37 crc kubenswrapper[4914]: I0130 21:50:37.561630 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgngj" event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerStarted","Data":"eff26569a5155340ac8fa5b54a7263ed441f48bf6065bb60eb00fc597aa47b40"} Jan 30 21:50:38 crc kubenswrapper[4914]: I0130 21:50:38.581258 4914 generic.go:334] "Generic (PLEG): container finished" podID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerID="5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575" exitCode=0 Jan 30 21:50:38 crc kubenswrapper[4914]: I0130 21:50:38.581373 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgngj" event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerDied","Data":"5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575"} Jan 30 21:50:40 crc kubenswrapper[4914]: I0130 21:50:40.599042 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgngj" 
event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerStarted","Data":"5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d"} Jan 30 21:50:41 crc kubenswrapper[4914]: I0130 21:50:41.611009 4914 generic.go:334] "Generic (PLEG): container finished" podID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerID="5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d" exitCode=0 Jan 30 21:50:41 crc kubenswrapper[4914]: I0130 21:50:41.611134 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgngj" event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerDied","Data":"5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d"} Jan 30 21:50:42 crc kubenswrapper[4914]: I0130 21:50:42.623817 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgngj" event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerStarted","Data":"6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929"} Jan 30 21:50:42 crc kubenswrapper[4914]: I0130 21:50:42.647676 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pgngj" podStartSLOduration=3.174289485 podStartE2EDuration="6.647653793s" podCreationTimestamp="2026-01-30 21:50:36 +0000 UTC" firstStartedPulling="2026-01-30 21:50:38.585065604 +0000 UTC m=+2172.023702405" lastFinishedPulling="2026-01-30 21:50:42.058429952 +0000 UTC m=+2175.497066713" observedRunningTime="2026-01-30 21:50:42.645614102 +0000 UTC m=+2176.084250863" watchObservedRunningTime="2026-01-30 21:50:42.647653793 +0000 UTC m=+2176.086290554" Jan 30 21:50:44 crc kubenswrapper[4914]: I0130 21:50:44.652380 4914 generic.go:334] "Generic (PLEG): container finished" podID="7f19f9c6-b274-40bd-9693-b26eb56bbe0a" containerID="9a3f6a49b0f9f84e28504a38be2164fa16db45d7dae5e85bba830441073f1911" exitCode=0 Jan 30 21:50:44 crc kubenswrapper[4914]: I0130 
21:50:44.652477 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" event={"ID":"7f19f9c6-b274-40bd-9693-b26eb56bbe0a","Type":"ContainerDied","Data":"9a3f6a49b0f9f84e28504a38be2164fa16db45d7dae5e85bba830441073f1911"} Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.216092 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.280560 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-nova-metadata-neutron-config-0\") pod \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.281022 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mrm9\" (UniqueName: \"kubernetes.io/projected/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-kube-api-access-4mrm9\") pod \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.282649 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-inventory\") pod \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.282695 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-ssh-key-openstack-edpm-ipam\") pod \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " Jan 30 21:50:46 
crc kubenswrapper[4914]: I0130 21:50:46.282975 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-metadata-combined-ca-bundle\") pod \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.283280 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\" (UID: \"7f19f9c6-b274-40bd-9693-b26eb56bbe0a\") " Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.287278 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-kube-api-access-4mrm9" (OuterVolumeSpecName: "kube-api-access-4mrm9") pod "7f19f9c6-b274-40bd-9693-b26eb56bbe0a" (UID: "7f19f9c6-b274-40bd-9693-b26eb56bbe0a"). InnerVolumeSpecName "kube-api-access-4mrm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.290214 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7f19f9c6-b274-40bd-9693-b26eb56bbe0a" (UID: "7f19f9c6-b274-40bd-9693-b26eb56bbe0a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.313345 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "7f19f9c6-b274-40bd-9693-b26eb56bbe0a" (UID: "7f19f9c6-b274-40bd-9693-b26eb56bbe0a"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.323674 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7f19f9c6-b274-40bd-9693-b26eb56bbe0a" (UID: "7f19f9c6-b274-40bd-9693-b26eb56bbe0a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.323938 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "7f19f9c6-b274-40bd-9693-b26eb56bbe0a" (UID: "7f19f9c6-b274-40bd-9693-b26eb56bbe0a"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.344879 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-inventory" (OuterVolumeSpecName: "inventory") pod "7f19f9c6-b274-40bd-9693-b26eb56bbe0a" (UID: "7f19f9c6-b274-40bd-9693-b26eb56bbe0a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.387308 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.387346 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.387361 4914 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.387373 4914 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.387389 4914 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.387401 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mrm9\" (UniqueName: \"kubernetes.io/projected/7f19f9c6-b274-40bd-9693-b26eb56bbe0a-kube-api-access-4mrm9\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.673635 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" event={"ID":"7f19f9c6-b274-40bd-9693-b26eb56bbe0a","Type":"ContainerDied","Data":"49a7980ff403c4243a7895931b2110b121d340bd0141af825431d0de1890cbb4"} Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.673677 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.673684 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49a7980ff403c4243a7895931b2110b121d340bd0141af825431d0de1890cbb4" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.825183 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d"] Jan 30 21:50:46 crc kubenswrapper[4914]: E0130 21:50:46.825604 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f19f9c6-b274-40bd-9693-b26eb56bbe0a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.825622 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f19f9c6-b274-40bd-9693-b26eb56bbe0a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.827264 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f19f9c6-b274-40bd-9693-b26eb56bbe0a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.828249 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.830698 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.830725 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.830759 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.831610 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.831891 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.843725 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d"] Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.925552 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.929483 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:46 crc kubenswrapper[4914]: I0130 21:50:46.970364 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.029086 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvn47\" (UniqueName: 
\"kubernetes.io/projected/90248a7e-c99e-4777-8767-3694c7a5b588-kube-api-access-nvn47\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.029167 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.029275 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.029295 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.029319 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: 
\"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.131626 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvn47\" (UniqueName: \"kubernetes.io/projected/90248a7e-c99e-4777-8767-3694c7a5b588-kube-api-access-nvn47\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.131987 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.132175 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.132290 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.132434 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.137899 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.137975 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.138671 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.148283 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.150075 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvn47\" (UniqueName: \"kubernetes.io/projected/90248a7e-c99e-4777-8767-3694c7a5b588-kube-api-access-nvn47\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.449938 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.748372 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:47 crc kubenswrapper[4914]: I0130 21:50:47.795632 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pgngj"] Jan 30 21:50:48 crc kubenswrapper[4914]: I0130 21:50:48.005000 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d"] Jan 30 21:50:48 crc kubenswrapper[4914]: I0130 21:50:48.014373 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:50:48 crc kubenswrapper[4914]: I0130 21:50:48.694731 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" event={"ID":"90248a7e-c99e-4777-8767-3694c7a5b588","Type":"ContainerStarted","Data":"76eebefa227c9d75ff4d19227c11824a2a64ed256376cd30738c1e94d84d8950"} Jan 30 21:50:49 crc kubenswrapper[4914]: I0130 21:50:49.703256 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" 
event={"ID":"90248a7e-c99e-4777-8767-3694c7a5b588","Type":"ContainerStarted","Data":"eb953990d9292a2e30ed52e97fae3d63caf4b531379ccd61dbe9b81f9fd5bfed"} Jan 30 21:50:49 crc kubenswrapper[4914]: I0130 21:50:49.703459 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pgngj" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="registry-server" containerID="cri-o://6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929" gracePeriod=2 Jan 30 21:50:49 crc kubenswrapper[4914]: I0130 21:50:49.724238 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" podStartSLOduration=3.2884188549999998 podStartE2EDuration="3.724217882s" podCreationTimestamp="2026-01-30 21:50:46 +0000 UTC" firstStartedPulling="2026-01-30 21:50:48.013972854 +0000 UTC m=+2181.452609645" lastFinishedPulling="2026-01-30 21:50:48.449771911 +0000 UTC m=+2181.888408672" observedRunningTime="2026-01-30 21:50:49.719901694 +0000 UTC m=+2183.158538455" watchObservedRunningTime="2026-01-30 21:50:49.724217882 +0000 UTC m=+2183.162854643" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.289870 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.401864 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fc9g\" (UniqueName: \"kubernetes.io/projected/34450472-8d4e-40db-a119-5cf53cdf72a8-kube-api-access-9fc9g\") pod \"34450472-8d4e-40db-a119-5cf53cdf72a8\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.402042 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-utilities\") pod \"34450472-8d4e-40db-a119-5cf53cdf72a8\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.402205 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-catalog-content\") pod \"34450472-8d4e-40db-a119-5cf53cdf72a8\" (UID: \"34450472-8d4e-40db-a119-5cf53cdf72a8\") " Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.403592 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-utilities" (OuterVolumeSpecName: "utilities") pod "34450472-8d4e-40db-a119-5cf53cdf72a8" (UID: "34450472-8d4e-40db-a119-5cf53cdf72a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.408996 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34450472-8d4e-40db-a119-5cf53cdf72a8-kube-api-access-9fc9g" (OuterVolumeSpecName: "kube-api-access-9fc9g") pod "34450472-8d4e-40db-a119-5cf53cdf72a8" (UID: "34450472-8d4e-40db-a119-5cf53cdf72a8"). InnerVolumeSpecName "kube-api-access-9fc9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.458549 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34450472-8d4e-40db-a119-5cf53cdf72a8" (UID: "34450472-8d4e-40db-a119-5cf53cdf72a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.504620 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fc9g\" (UniqueName: \"kubernetes.io/projected/34450472-8d4e-40db-a119-5cf53cdf72a8-kube-api-access-9fc9g\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.504882 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.504956 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34450472-8d4e-40db-a119-5cf53cdf72a8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.715929 4914 generic.go:334] "Generic (PLEG): container finished" podID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerID="6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929" exitCode=0 Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.715995 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pgngj" event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerDied","Data":"6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929"} Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.716276 4914 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-pgngj" event={"ID":"34450472-8d4e-40db-a119-5cf53cdf72a8","Type":"ContainerDied","Data":"eff26569a5155340ac8fa5b54a7263ed441f48bf6065bb60eb00fc597aa47b40"} Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.716314 4914 scope.go:117] "RemoveContainer" containerID="6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.716029 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pgngj" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.741905 4914 scope.go:117] "RemoveContainer" containerID="5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.750572 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pgngj"] Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.763674 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pgngj"] Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.778355 4914 scope.go:117] "RemoveContainer" containerID="5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.816904 4914 scope.go:117] "RemoveContainer" containerID="6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929" Jan 30 21:50:50 crc kubenswrapper[4914]: E0130 21:50:50.817291 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929\": container with ID starting with 6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929 not found: ID does not exist" containerID="6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 
21:50:50.817322 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929"} err="failed to get container status \"6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929\": rpc error: code = NotFound desc = could not find container \"6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929\": container with ID starting with 6466e6cc56e0c80a904e9427197dcb958d7cd6595d2a0f649e49568a5e2a3929 not found: ID does not exist" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.817344 4914 scope.go:117] "RemoveContainer" containerID="5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d" Jan 30 21:50:50 crc kubenswrapper[4914]: E0130 21:50:50.817823 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d\": container with ID starting with 5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d not found: ID does not exist" containerID="5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.817864 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d"} err="failed to get container status \"5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d\": rpc error: code = NotFound desc = could not find container \"5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d\": container with ID starting with 5f458b40a8cd7b49527c53bc1af383a18ccb5a396754f5009e7e33832493153d not found: ID does not exist" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.817895 4914 scope.go:117] "RemoveContainer" containerID="5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575" Jan 30 21:50:50 crc 
kubenswrapper[4914]: E0130 21:50:50.818219 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575\": container with ID starting with 5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575 not found: ID does not exist" containerID="5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575" Jan 30 21:50:50 crc kubenswrapper[4914]: I0130 21:50:50.818245 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575"} err="failed to get container status \"5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575\": rpc error: code = NotFound desc = could not find container \"5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575\": container with ID starting with 5c674bf9594c55e79f545a812d5e9faf129711b0c36d90063be9a8f779c4a575 not found: ID does not exist" Jan 30 21:50:51 crc kubenswrapper[4914]: I0130 21:50:51.830149 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" path="/var/lib/kubelet/pods/34450472-8d4e-40db-a119-5cf53cdf72a8/volumes" Jan 30 21:50:56 crc kubenswrapper[4914]: I0130 21:50:56.982972 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:50:56 crc kubenswrapper[4914]: I0130 21:50:56.983648 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 30 21:50:56 crc kubenswrapper[4914]: I0130 21:50:56.983758 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:50:56 crc kubenswrapper[4914]: I0130 21:50:56.984825 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2beeffdd2e3a30f174e411bd48f6951bdc1c5b950b8351ad0c9f10106fc74a69"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:50:56 crc kubenswrapper[4914]: I0130 21:50:56.984910 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://2beeffdd2e3a30f174e411bd48f6951bdc1c5b950b8351ad0c9f10106fc74a69" gracePeriod=600 Jan 30 21:50:57 crc kubenswrapper[4914]: I0130 21:50:57.815485 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="2beeffdd2e3a30f174e411bd48f6951bdc1c5b950b8351ad0c9f10106fc74a69" exitCode=0 Jan 30 21:50:57 crc kubenswrapper[4914]: I0130 21:50:57.815825 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"2beeffdd2e3a30f174e411bd48f6951bdc1c5b950b8351ad0c9f10106fc74a69"} Jan 30 21:50:57 crc kubenswrapper[4914]: I0130 21:50:57.816075 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" 
event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5"} Jan 30 21:50:57 crc kubenswrapper[4914]: I0130 21:50:57.816103 4914 scope.go:117] "RemoveContainer" containerID="1a6013c0ec186b8427ea925cf86d18151e04b8731f5311ab58181c5d24389a56" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.004526 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqmpk"] Jan 30 21:50:58 crc kubenswrapper[4914]: E0130 21:50:58.004928 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="extract-utilities" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.004940 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="extract-utilities" Jan 30 21:50:58 crc kubenswrapper[4914]: E0130 21:50:58.004949 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="extract-content" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.004954 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="extract-content" Jan 30 21:50:58 crc kubenswrapper[4914]: E0130 21:50:58.004986 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="registry-server" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.004992 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="registry-server" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.005200 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="34450472-8d4e-40db-a119-5cf53cdf72a8" containerName="registry-server" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.006540 4914 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.047640 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqmpk"] Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.090123 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-utilities\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.090256 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzz6\" (UniqueName: \"kubernetes.io/projected/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-kube-api-access-wgzz6\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.090316 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-catalog-content\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.192275 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-utilities\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.192356 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wgzz6\" (UniqueName: \"kubernetes.io/projected/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-kube-api-access-wgzz6\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.192391 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-catalog-content\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.192793 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-utilities\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.192923 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-catalog-content\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.213909 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzz6\" (UniqueName: \"kubernetes.io/projected/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-kube-api-access-wgzz6\") pod \"redhat-marketplace-pqmpk\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.337405 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:50:58 crc kubenswrapper[4914]: I0130 21:50:58.840214 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqmpk"] Jan 30 21:50:59 crc kubenswrapper[4914]: I0130 21:50:59.841122 4914 generic.go:334] "Generic (PLEG): container finished" podID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerID="3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23" exitCode=0 Jan 30 21:50:59 crc kubenswrapper[4914]: I0130 21:50:59.841188 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerDied","Data":"3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23"} Jan 30 21:50:59 crc kubenswrapper[4914]: I0130 21:50:59.841764 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerStarted","Data":"f01b3b0cb95ccd56c57eb6e13447115105e60b6d591751abc1c2c9938c713c45"} Jan 30 21:51:01 crc kubenswrapper[4914]: I0130 21:51:01.863719 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerStarted","Data":"3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396"} Jan 30 21:51:02 crc kubenswrapper[4914]: I0130 21:51:02.875182 4914 generic.go:334] "Generic (PLEG): container finished" podID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerID="3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396" exitCode=0 Jan 30 21:51:02 crc kubenswrapper[4914]: I0130 21:51:02.875281 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" 
event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerDied","Data":"3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396"} Jan 30 21:51:03 crc kubenswrapper[4914]: I0130 21:51:03.891583 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerStarted","Data":"c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05"} Jan 30 21:51:03 crc kubenswrapper[4914]: I0130 21:51:03.917131 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqmpk" podStartSLOduration=3.35389368 podStartE2EDuration="6.917111273s" podCreationTimestamp="2026-01-30 21:50:57 +0000 UTC" firstStartedPulling="2026-01-30 21:50:59.843657302 +0000 UTC m=+2193.282294063" lastFinishedPulling="2026-01-30 21:51:03.406874895 +0000 UTC m=+2196.845511656" observedRunningTime="2026-01-30 21:51:03.911724048 +0000 UTC m=+2197.350360819" watchObservedRunningTime="2026-01-30 21:51:03.917111273 +0000 UTC m=+2197.355748034" Jan 30 21:51:08 crc kubenswrapper[4914]: I0130 21:51:08.338321 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:51:08 crc kubenswrapper[4914]: I0130 21:51:08.338967 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:51:08 crc kubenswrapper[4914]: I0130 21:51:08.392796 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:51:08 crc kubenswrapper[4914]: I0130 21:51:08.995800 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:51:09 crc kubenswrapper[4914]: I0130 21:51:09.051030 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pqmpk"] Jan 30 21:51:10 crc kubenswrapper[4914]: I0130 21:51:10.961491 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqmpk" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="registry-server" containerID="cri-o://c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05" gracePeriod=2 Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.446906 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.581379 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-catalog-content\") pod \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.581636 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-utilities\") pod \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.581737 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgzz6\" (UniqueName: \"kubernetes.io/projected/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-kube-api-access-wgzz6\") pod \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\" (UID: \"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7\") " Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.582591 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-utilities" (OuterVolumeSpecName: "utilities") pod "b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" (UID: 
"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.601252 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-kube-api-access-wgzz6" (OuterVolumeSpecName: "kube-api-access-wgzz6") pod "b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" (UID: "b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7"). InnerVolumeSpecName "kube-api-access-wgzz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.633432 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" (UID: "b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.684156 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.684182 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgzz6\" (UniqueName: \"kubernetes.io/projected/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-kube-api-access-wgzz6\") on node \"crc\" DevicePath \"\"" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.684194 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.974559 4914 generic.go:334] "Generic (PLEG): container finished" 
podID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerID="c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05" exitCode=0 Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.974616 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerDied","Data":"c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05"} Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.974677 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqmpk" event={"ID":"b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7","Type":"ContainerDied","Data":"f01b3b0cb95ccd56c57eb6e13447115105e60b6d591751abc1c2c9938c713c45"} Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.974629 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqmpk" Jan 30 21:51:11 crc kubenswrapper[4914]: I0130 21:51:11.974733 4914 scope.go:117] "RemoveContainer" containerID="c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.011591 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqmpk"] Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.013850 4914 scope.go:117] "RemoveContainer" containerID="3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.025640 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqmpk"] Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.039482 4914 scope.go:117] "RemoveContainer" containerID="3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.091255 4914 scope.go:117] "RemoveContainer" 
containerID="c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05" Jan 30 21:51:12 crc kubenswrapper[4914]: E0130 21:51:12.091962 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05\": container with ID starting with c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05 not found: ID does not exist" containerID="c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.091998 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05"} err="failed to get container status \"c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05\": rpc error: code = NotFound desc = could not find container \"c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05\": container with ID starting with c24753de67306ed2b66dedc54c2781f2937fedc6b62acc186ca324aee3a73f05 not found: ID does not exist" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.092040 4914 scope.go:117] "RemoveContainer" containerID="3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396" Jan 30 21:51:12 crc kubenswrapper[4914]: E0130 21:51:12.092529 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396\": container with ID starting with 3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396 not found: ID does not exist" containerID="3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.092571 4914 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396"} err="failed to get container status \"3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396\": rpc error: code = NotFound desc = could not find container \"3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396\": container with ID starting with 3bdab2af8eaf27a8cb1373704b5297bd38541ed8874e1764187af72a12d88396 not found: ID does not exist" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.092589 4914 scope.go:117] "RemoveContainer" containerID="3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23" Jan 30 21:51:12 crc kubenswrapper[4914]: E0130 21:51:12.093003 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23\": container with ID starting with 3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23 not found: ID does not exist" containerID="3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23" Jan 30 21:51:12 crc kubenswrapper[4914]: I0130 21:51:12.093048 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23"} err="failed to get container status \"3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23\": rpc error: code = NotFound desc = could not find container \"3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23\": container with ID starting with 3d3bdf88125eefec735bc7862d5e87fa8ecd29838c888fd02b0597fadcd74e23 not found: ID does not exist" Jan 30 21:51:13 crc kubenswrapper[4914]: I0130 21:51:13.829857 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" path="/var/lib/kubelet/pods/b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7/volumes" Jan 30 21:53:26 crc kubenswrapper[4914]: I0130 
21:53:26.983555 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:53:26 crc kubenswrapper[4914]: I0130 21:53:26.984192 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:53:56 crc kubenswrapper[4914]: I0130 21:53:56.983380 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:53:56 crc kubenswrapper[4914]: I0130 21:53:56.984084 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:54:26 crc kubenswrapper[4914]: I0130 21:54:26.983135 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:54:26 crc kubenswrapper[4914]: I0130 21:54:26.983745 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" 
podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:54:26 crc kubenswrapper[4914]: I0130 21:54:26.983793 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 21:54:26 crc kubenswrapper[4914]: I0130 21:54:26.984610 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:54:26 crc kubenswrapper[4914]: I0130 21:54:26.984658 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" gracePeriod=600 Jan 30 21:54:27 crc kubenswrapper[4914]: E0130 21:54:27.119307 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:54:27 crc kubenswrapper[4914]: I0130 21:54:27.933067 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" exitCode=0 Jan 30 
21:54:27 crc kubenswrapper[4914]: I0130 21:54:27.933110 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5"} Jan 30 21:54:27 crc kubenswrapper[4914]: I0130 21:54:27.933163 4914 scope.go:117] "RemoveContainer" containerID="2beeffdd2e3a30f174e411bd48f6951bdc1c5b950b8351ad0c9f10106fc74a69" Jan 30 21:54:27 crc kubenswrapper[4914]: I0130 21:54:27.934013 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:54:27 crc kubenswrapper[4914]: E0130 21:54:27.934345 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:54:40 crc kubenswrapper[4914]: I0130 21:54:40.818128 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:54:40 crc kubenswrapper[4914]: E0130 21:54:40.819612 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:54:55 crc kubenswrapper[4914]: I0130 21:54:55.190796 4914 generic.go:334] "Generic (PLEG): container finished" 
podID="90248a7e-c99e-4777-8767-3694c7a5b588" containerID="eb953990d9292a2e30ed52e97fae3d63caf4b531379ccd61dbe9b81f9fd5bfed" exitCode=0 Jan 30 21:54:55 crc kubenswrapper[4914]: I0130 21:54:55.190870 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" event={"ID":"90248a7e-c99e-4777-8767-3694c7a5b588","Type":"ContainerDied","Data":"eb953990d9292a2e30ed52e97fae3d63caf4b531379ccd61dbe9b81f9fd5bfed"} Jan 30 21:54:55 crc kubenswrapper[4914]: I0130 21:54:55.818220 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:54:55 crc kubenswrapper[4914]: E0130 21:54:55.818639 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.708107 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.876957 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-ssh-key-openstack-edpm-ipam\") pod \"90248a7e-c99e-4777-8767-3694c7a5b588\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.877484 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-inventory\") pod \"90248a7e-c99e-4777-8767-3694c7a5b588\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.877556 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvn47\" (UniqueName: \"kubernetes.io/projected/90248a7e-c99e-4777-8767-3694c7a5b588-kube-api-access-nvn47\") pod \"90248a7e-c99e-4777-8767-3694c7a5b588\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.877622 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-secret-0\") pod \"90248a7e-c99e-4777-8767-3694c7a5b588\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.877641 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-combined-ca-bundle\") pod \"90248a7e-c99e-4777-8767-3694c7a5b588\" (UID: \"90248a7e-c99e-4777-8767-3694c7a5b588\") " Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.884172 4914 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90248a7e-c99e-4777-8767-3694c7a5b588-kube-api-access-nvn47" (OuterVolumeSpecName: "kube-api-access-nvn47") pod "90248a7e-c99e-4777-8767-3694c7a5b588" (UID: "90248a7e-c99e-4777-8767-3694c7a5b588"). InnerVolumeSpecName "kube-api-access-nvn47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.891903 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "90248a7e-c99e-4777-8767-3694c7a5b588" (UID: "90248a7e-c99e-4777-8767-3694c7a5b588"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.906957 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-inventory" (OuterVolumeSpecName: "inventory") pod "90248a7e-c99e-4777-8767-3694c7a5b588" (UID: "90248a7e-c99e-4777-8767-3694c7a5b588"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.916465 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "90248a7e-c99e-4777-8767-3694c7a5b588" (UID: "90248a7e-c99e-4777-8767-3694c7a5b588"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.918903 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "90248a7e-c99e-4777-8767-3694c7a5b588" (UID: "90248a7e-c99e-4777-8767-3694c7a5b588"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.981240 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.981294 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvn47\" (UniqueName: \"kubernetes.io/projected/90248a7e-c99e-4777-8767-3694c7a5b588-kube-api-access-nvn47\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.981340 4914 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.981351 4914 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:56 crc kubenswrapper[4914]: I0130 21:54:56.981360 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90248a7e-c99e-4777-8767-3694c7a5b588-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.214417 4914 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" event={"ID":"90248a7e-c99e-4777-8767-3694c7a5b588","Type":"ContainerDied","Data":"76eebefa227c9d75ff4d19227c11824a2a64ed256376cd30738c1e94d84d8950"} Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.214466 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76eebefa227c9d75ff4d19227c11824a2a64ed256376cd30738c1e94d84d8950" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.214468 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.320941 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z"] Jan 30 21:54:57 crc kubenswrapper[4914]: E0130 21:54:57.321400 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="registry-server" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.321421 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="registry-server" Jan 30 21:54:57 crc kubenswrapper[4914]: E0130 21:54:57.321442 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90248a7e-c99e-4777-8767-3694c7a5b588" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.321450 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="90248a7e-c99e-4777-8767-3694c7a5b588" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 21:54:57 crc kubenswrapper[4914]: E0130 21:54:57.321458 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="extract-content" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.321464 4914 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="extract-content" Jan 30 21:54:57 crc kubenswrapper[4914]: E0130 21:54:57.321481 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="extract-utilities" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.321486 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="extract-utilities" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.321661 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d1eebb-ec32-4b2c-8b2f-9a5ad646c5c7" containerName="registry-server" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.321684 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="90248a7e-c99e-4777-8767-3694c7a5b588" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.322564 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.324465 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.325282 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.325540 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.325718 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.325848 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.326449 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.328817 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.341781 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z"] Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389266 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 
21:54:57.389346 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389426 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389450 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxxj\" (UniqueName: \"kubernetes.io/projected/9490e581-cf4d-4139-a77f-5f2b790ea96b-kube-api-access-cxxxj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389639 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389723 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389824 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389874 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.389904 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.491950 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492004 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492065 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492130 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492170 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492225 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492298 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492372 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.492406 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxxj\" (UniqueName: \"kubernetes.io/projected/9490e581-cf4d-4139-a77f-5f2b790ea96b-kube-api-access-cxxxj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.494956 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.498542 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.498625 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.498669 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.499210 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.499417 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.499799 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.499966 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.512434 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxxj\" (UniqueName: \"kubernetes.io/projected/9490e581-cf4d-4139-a77f-5f2b790ea96b-kube-api-access-cxxxj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zmw7z\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:57 crc kubenswrapper[4914]: I0130 21:54:57.683566 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:54:58 crc kubenswrapper[4914]: I0130 21:54:58.439035 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z"] Jan 30 21:54:59 crc kubenswrapper[4914]: I0130 21:54:59.238293 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" event={"ID":"9490e581-cf4d-4139-a77f-5f2b790ea96b","Type":"ContainerStarted","Data":"13aae9b3d6a1497370e6c3961e09688781e6d9af90575dff3be9ab9edf8c0e58"} Jan 30 21:55:00 crc kubenswrapper[4914]: I0130 21:55:00.248698 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" event={"ID":"9490e581-cf4d-4139-a77f-5f2b790ea96b","Type":"ContainerStarted","Data":"a22fc195296bd9b664ec96ebfe264ab9b07f3485adda62db55bea3879c7e633d"} Jan 30 21:55:00 crc kubenswrapper[4914]: I0130 21:55:00.272228 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" podStartSLOduration=2.014640119 podStartE2EDuration="3.272206264s" podCreationTimestamp="2026-01-30 21:54:57 +0000 UTC" firstStartedPulling="2026-01-30 21:54:58.43591905 +0000 UTC m=+2431.874555811" lastFinishedPulling="2026-01-30 21:54:59.693485195 +0000 UTC m=+2433.132121956" observedRunningTime="2026-01-30 21:55:00.268587571 +0000 UTC m=+2433.707224332" watchObservedRunningTime="2026-01-30 21:55:00.272206264 +0000 UTC m=+2433.710843035" Jan 30 21:55:08 crc kubenswrapper[4914]: I0130 21:55:08.818646 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:55:08 crc kubenswrapper[4914]: E0130 21:55:08.819664 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:55:21 crc kubenswrapper[4914]: I0130 21:55:21.818498 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:55:21 crc kubenswrapper[4914]: E0130 21:55:21.819279 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:55:35 crc kubenswrapper[4914]: I0130 21:55:35.818466 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:55:35 crc kubenswrapper[4914]: E0130 21:55:35.819336 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:55:48 crc kubenswrapper[4914]: I0130 21:55:48.818924 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:55:48 crc kubenswrapper[4914]: E0130 21:55:48.819664 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:56:00 crc kubenswrapper[4914]: I0130 21:56:00.818216 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:56:00 crc kubenswrapper[4914]: E0130 21:56:00.818844 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:56:11 crc kubenswrapper[4914]: I0130 21:56:11.818856 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:56:11 crc kubenswrapper[4914]: E0130 21:56:11.819640 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:56:26 crc kubenswrapper[4914]: I0130 21:56:26.819014 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:56:26 crc kubenswrapper[4914]: E0130 21:56:26.819978 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:56:37 crc kubenswrapper[4914]: I0130 21:56:37.829077 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:56:37 crc kubenswrapper[4914]: E0130 21:56:37.830287 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:56:52 crc kubenswrapper[4914]: I0130 21:56:52.817839 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:56:52 crc kubenswrapper[4914]: E0130 21:56:52.818756 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:57:04 crc kubenswrapper[4914]: I0130 21:57:04.818607 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:57:04 crc kubenswrapper[4914]: E0130 21:57:04.819663 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:57:11 crc kubenswrapper[4914]: I0130 21:57:11.485782 4914 generic.go:334] "Generic (PLEG): container finished" podID="9490e581-cf4d-4139-a77f-5f2b790ea96b" containerID="a22fc195296bd9b664ec96ebfe264ab9b07f3485adda62db55bea3879c7e633d" exitCode=0 Jan 30 21:57:11 crc kubenswrapper[4914]: I0130 21:57:11.485882 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" event={"ID":"9490e581-cf4d-4139-a77f-5f2b790ea96b","Type":"ContainerDied","Data":"a22fc195296bd9b664ec96ebfe264ab9b07f3485adda62db55bea3879c7e633d"} Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.033111 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.140661 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-ssh-key-openstack-edpm-ipam\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.140806 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-extra-config-0\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.140848 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-combined-ca-bundle\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.140900 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-0\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.141110 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-inventory\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.141231 4914 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-0\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.141294 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxxj\" (UniqueName: \"kubernetes.io/projected/9490e581-cf4d-4139-a77f-5f2b790ea96b-kube-api-access-cxxxj\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.141345 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-1\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.141475 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-1\") pod \"9490e581-cf4d-4139-a77f-5f2b790ea96b\" (UID: \"9490e581-cf4d-4139-a77f-5f2b790ea96b\") " Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.146971 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9490e581-cf4d-4139-a77f-5f2b790ea96b-kube-api-access-cxxxj" (OuterVolumeSpecName: "kube-api-access-cxxxj") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "kube-api-access-cxxxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.151852 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.169896 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.170619 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.171150 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.173095 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.174938 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-inventory" (OuterVolumeSpecName: "inventory") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.179876 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.182484 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9490e581-cf4d-4139-a77f-5f2b790ea96b" (UID: "9490e581-cf4d-4139-a77f-5f2b790ea96b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244359 4914 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244492 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244556 4914 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244614 4914 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244669 4914 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244740 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244879 4914 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-migration-ssh-key-0\") on node 
\"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.244934 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxxj\" (UniqueName: \"kubernetes.io/projected/9490e581-cf4d-4139-a77f-5f2b790ea96b-kube-api-access-cxxxj\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.245008 4914 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9490e581-cf4d-4139-a77f-5f2b790ea96b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.505170 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" event={"ID":"9490e581-cf4d-4139-a77f-5f2b790ea96b","Type":"ContainerDied","Data":"13aae9b3d6a1497370e6c3961e09688781e6d9af90575dff3be9ab9edf8c0e58"} Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.505205 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13aae9b3d6a1497370e6c3961e09688781e6d9af90575dff3be9ab9edf8c0e58" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.505507 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zmw7z" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.626039 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg"] Jan 30 21:57:13 crc kubenswrapper[4914]: E0130 21:57:13.626513 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9490e581-cf4d-4139-a77f-5f2b790ea96b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.626529 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="9490e581-cf4d-4139-a77f-5f2b790ea96b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.626724 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="9490e581-cf4d-4139-a77f-5f2b790ea96b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.627480 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.631074 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pplqz" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.631260 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.631646 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.631663 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.631754 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.638376 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg"] Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.757562 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.757843 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: 
\"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.757891 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.757925 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmjw\" (UniqueName: \"kubernetes.io/projected/294df817-0302-46c7-84cf-f300a188d47a-kube-api-access-4zmjw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.757952 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.757977 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.758011 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859516 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmjw\" (UniqueName: \"kubernetes.io/projected/294df817-0302-46c7-84cf-f300a188d47a-kube-api-access-4zmjw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859571 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859598 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859632 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859691 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859804 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.859874 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.865365 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.865973 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.872293 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.872773 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.873978 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.876338 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.884427 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmjw\" (UniqueName: \"kubernetes.io/projected/294df817-0302-46c7-84cf-f300a188d47a-kube-api-access-4zmjw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:13 crc kubenswrapper[4914]: I0130 21:57:13.947014 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:57:14 crc kubenswrapper[4914]: I0130 21:57:14.527209 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg"] Jan 30 21:57:14 crc kubenswrapper[4914]: I0130 21:57:14.528195 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:57:15 crc kubenswrapper[4914]: I0130 21:57:15.538730 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" event={"ID":"294df817-0302-46c7-84cf-f300a188d47a","Type":"ContainerStarted","Data":"c5d2907b2da0214460828b10bfa90052df42ac983b77f5cde86bd11e6b83410d"} Jan 30 21:57:15 crc kubenswrapper[4914]: I0130 21:57:15.539054 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" event={"ID":"294df817-0302-46c7-84cf-f300a188d47a","Type":"ContainerStarted","Data":"5e400d2c976fac104b36a8a718a164584b82c603dfcbb806d7ad6f52fa02a897"} Jan 30 21:57:15 crc kubenswrapper[4914]: I0130 21:57:15.555420 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" podStartSLOduration=1.872564844 podStartE2EDuration="2.55540464s" podCreationTimestamp="2026-01-30 21:57:13 +0000 UTC" firstStartedPulling="2026-01-30 21:57:14.528013629 +0000 UTC m=+2567.966650390" lastFinishedPulling="2026-01-30 21:57:15.210853425 +0000 UTC m=+2568.649490186" observedRunningTime="2026-01-30 21:57:15.552740206 +0000 UTC m=+2568.991376967" watchObservedRunningTime="2026-01-30 21:57:15.55540464 +0000 UTC m=+2568.994041401" Jan 30 21:57:19 crc kubenswrapper[4914]: I0130 21:57:19.819362 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:57:19 crc kubenswrapper[4914]: E0130 
21:57:19.820532 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:57:34 crc kubenswrapper[4914]: I0130 21:57:34.818178 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:57:34 crc kubenswrapper[4914]: E0130 21:57:34.818928 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:57:45 crc kubenswrapper[4914]: I0130 21:57:45.818127 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:57:45 crc kubenswrapper[4914]: E0130 21:57:45.818957 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:57:57 crc kubenswrapper[4914]: I0130 21:57:57.824276 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:57:57 crc 
kubenswrapper[4914]: E0130 21:57:57.824903 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:58:10 crc kubenswrapper[4914]: I0130 21:58:10.818566 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:58:10 crc kubenswrapper[4914]: E0130 21:58:10.819322 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:58:23 crc kubenswrapper[4914]: I0130 21:58:23.818196 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:58:23 crc kubenswrapper[4914]: E0130 21:58:23.819031 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:58:34 crc kubenswrapper[4914]: I0130 21:58:34.818988 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 
30 21:58:34 crc kubenswrapper[4914]: E0130 21:58:34.820849 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:58:46 crc kubenswrapper[4914]: I0130 21:58:46.818205 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:58:46 crc kubenswrapper[4914]: E0130 21:58:46.819256 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:59:01 crc kubenswrapper[4914]: I0130 21:59:01.818587 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:59:01 crc kubenswrapper[4914]: E0130 21:59:01.820141 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:59:16 crc kubenswrapper[4914]: I0130 21:59:16.819129 4914 scope.go:117] "RemoveContainer" 
containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:59:16 crc kubenswrapper[4914]: E0130 21:59:16.820172 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 21:59:31 crc kubenswrapper[4914]: I0130 21:59:31.818581 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5" Jan 30 21:59:32 crc kubenswrapper[4914]: I0130 21:59:32.859343 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"8c7d42f95932576854d92f120d17b18c9c987ff1795ac1cf8cef01b20f9ab65c"} Jan 30 21:59:33 crc kubenswrapper[4914]: I0130 21:59:33.870116 4914 generic.go:334] "Generic (PLEG): container finished" podID="294df817-0302-46c7-84cf-f300a188d47a" containerID="c5d2907b2da0214460828b10bfa90052df42ac983b77f5cde86bd11e6b83410d" exitCode=0 Jan 30 21:59:33 crc kubenswrapper[4914]: I0130 21:59:33.870201 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" event={"ID":"294df817-0302-46c7-84cf-f300a188d47a","Type":"ContainerDied","Data":"c5d2907b2da0214460828b10bfa90052df42ac983b77f5cde86bd11e6b83410d"} Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.407934 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537055 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ssh-key-openstack-edpm-ipam\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537104 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-1\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537312 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-0\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537356 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-2\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537383 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-telemetry-combined-ca-bundle\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: 
\"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537421 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zmjw\" (UniqueName: \"kubernetes.io/projected/294df817-0302-46c7-84cf-f300a188d47a-kube-api-access-4zmjw\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.537488 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-inventory\") pod \"294df817-0302-46c7-84cf-f300a188d47a\" (UID: \"294df817-0302-46c7-84cf-f300a188d47a\") " Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.547464 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.559798 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294df817-0302-46c7-84cf-f300a188d47a-kube-api-access-4zmjw" (OuterVolumeSpecName: "kube-api-access-4zmjw") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "kube-api-access-4zmjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.567980 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.570976 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.571399 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.574400 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-inventory" (OuterVolumeSpecName: "inventory") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.578267 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "294df817-0302-46c7-84cf-f300a188d47a" (UID: "294df817-0302-46c7-84cf-f300a188d47a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.639913 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.639958 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.639993 4914 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.640012 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zmjw\" (UniqueName: \"kubernetes.io/projected/294df817-0302-46c7-84cf-f300a188d47a-kube-api-access-4zmjw\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.640025 4914 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc 
kubenswrapper[4914]: I0130 21:59:35.640035 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.640047 4914 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/294df817-0302-46c7-84cf-f300a188d47a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.891225 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" event={"ID":"294df817-0302-46c7-84cf-f300a188d47a","Type":"ContainerDied","Data":"5e400d2c976fac104b36a8a718a164584b82c603dfcbb806d7ad6f52fa02a897"} Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.891622 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e400d2c976fac104b36a8a718a164584b82c603dfcbb806d7ad6f52fa02a897" Jan 30 21:59:35 crc kubenswrapper[4914]: I0130 21:59:35.891276 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.152847 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk"] Jan 30 22:00:00 crc kubenswrapper[4914]: E0130 22:00:00.154341 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294df817-0302-46c7-84cf-f300a188d47a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.154365 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="294df817-0302-46c7-84cf-f300a188d47a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.155079 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="294df817-0302-46c7-84cf-f300a188d47a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.156309 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.158434 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.158615 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.169740 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk"] Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.327354 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7946508-01dc-4371-95b3-d231fa850fe8-secret-volume\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.327416 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7946508-01dc-4371-95b3-d231fa850fe8-config-volume\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.328044 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-597ts\" (UniqueName: \"kubernetes.io/projected/f7946508-01dc-4371-95b3-d231fa850fe8-kube-api-access-597ts\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.430838 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7946508-01dc-4371-95b3-d231fa850fe8-secret-volume\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.430929 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7946508-01dc-4371-95b3-d231fa850fe8-config-volume\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.431085 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-597ts\" (UniqueName: \"kubernetes.io/projected/f7946508-01dc-4371-95b3-d231fa850fe8-kube-api-access-597ts\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.433115 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7946508-01dc-4371-95b3-d231fa850fe8-config-volume\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.439148 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f7946508-01dc-4371-95b3-d231fa850fe8-secret-volume\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.448742 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-597ts\" (UniqueName: \"kubernetes.io/projected/f7946508-01dc-4371-95b3-d231fa850fe8-kube-api-access-597ts\") pod \"collect-profiles-29496840-gslnk\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:00 crc kubenswrapper[4914]: I0130 22:00:00.489267 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:01 crc kubenswrapper[4914]: I0130 22:00:01.029456 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk"] Jan 30 22:00:01 crc kubenswrapper[4914]: I0130 22:00:01.124609 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" event={"ID":"f7946508-01dc-4371-95b3-d231fa850fe8","Type":"ContainerStarted","Data":"a6d043da8880e652c98542ba1b1254f3fde5a5e749a810a02918fbda32f47ea8"} Jan 30 22:00:02 crc kubenswrapper[4914]: I0130 22:00:02.137964 4914 generic.go:334] "Generic (PLEG): container finished" podID="f7946508-01dc-4371-95b3-d231fa850fe8" containerID="c7d0d65fc5b5af940c4891a7f1fc1190a1b2b6982a60c8f027e13f62ba13d119" exitCode=0 Jan 30 22:00:02 crc kubenswrapper[4914]: I0130 22:00:02.138072 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" 
event={"ID":"f7946508-01dc-4371-95b3-d231fa850fe8","Type":"ContainerDied","Data":"c7d0d65fc5b5af940c4891a7f1fc1190a1b2b6982a60c8f027e13f62ba13d119"} Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.657616 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.818962 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7946508-01dc-4371-95b3-d231fa850fe8-config-volume\") pod \"f7946508-01dc-4371-95b3-d231fa850fe8\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.819276 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7946508-01dc-4371-95b3-d231fa850fe8-secret-volume\") pod \"f7946508-01dc-4371-95b3-d231fa850fe8\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.819358 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-597ts\" (UniqueName: \"kubernetes.io/projected/f7946508-01dc-4371-95b3-d231fa850fe8-kube-api-access-597ts\") pod \"f7946508-01dc-4371-95b3-d231fa850fe8\" (UID: \"f7946508-01dc-4371-95b3-d231fa850fe8\") " Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.819681 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7946508-01dc-4371-95b3-d231fa850fe8-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7946508-01dc-4371-95b3-d231fa850fe8" (UID: "f7946508-01dc-4371-95b3-d231fa850fe8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.819924 4914 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7946508-01dc-4371-95b3-d231fa850fe8-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.827833 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7946508-01dc-4371-95b3-d231fa850fe8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7946508-01dc-4371-95b3-d231fa850fe8" (UID: "f7946508-01dc-4371-95b3-d231fa850fe8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.834165 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7946508-01dc-4371-95b3-d231fa850fe8-kube-api-access-597ts" (OuterVolumeSpecName: "kube-api-access-597ts") pod "f7946508-01dc-4371-95b3-d231fa850fe8" (UID: "f7946508-01dc-4371-95b3-d231fa850fe8"). InnerVolumeSpecName "kube-api-access-597ts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.921864 4914 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7946508-01dc-4371-95b3-d231fa850fe8-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 22:00:03 crc kubenswrapper[4914]: I0130 22:00:03.921905 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-597ts\" (UniqueName: \"kubernetes.io/projected/f7946508-01dc-4371-95b3-d231fa850fe8-kube-api-access-597ts\") on node \"crc\" DevicePath \"\"" Jan 30 22:00:04 crc kubenswrapper[4914]: I0130 22:00:04.160791 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" event={"ID":"f7946508-01dc-4371-95b3-d231fa850fe8","Type":"ContainerDied","Data":"a6d043da8880e652c98542ba1b1254f3fde5a5e749a810a02918fbda32f47ea8"} Jan 30 22:00:04 crc kubenswrapper[4914]: I0130 22:00:04.160841 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d043da8880e652c98542ba1b1254f3fde5a5e749a810a02918fbda32f47ea8" Jan 30 22:00:04 crc kubenswrapper[4914]: I0130 22:00:04.160868 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-gslnk" Jan 30 22:00:04 crc kubenswrapper[4914]: I0130 22:00:04.738086 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd"] Jan 30 22:00:04 crc kubenswrapper[4914]: I0130 22:00:04.749527 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-pwhjd"] Jan 30 22:00:05 crc kubenswrapper[4914]: I0130 22:00:05.831642 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e050cbd0-653b-4d23-8a69-affa52be9608" path="/var/lib/kubelet/pods/e050cbd0-653b-4d23-8a69-affa52be9608/volumes" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.351733 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mq4s8"] Jan 30 22:00:14 crc kubenswrapper[4914]: E0130 22:00:14.352824 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7946508-01dc-4371-95b3-d231fa850fe8" containerName="collect-profiles" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.352844 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7946508-01dc-4371-95b3-d231fa850fe8" containerName="collect-profiles" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.353142 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7946508-01dc-4371-95b3-d231fa850fe8" containerName="collect-profiles" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.355460 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.364781 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mq4s8"] Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.479954 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-catalog-content\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.480094 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-utilities\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.480305 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7k5v\" (UniqueName: \"kubernetes.io/projected/1b272b73-2eee-4d7e-bcba-e83d2e008636-kube-api-access-b7k5v\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.582669 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7k5v\" (UniqueName: \"kubernetes.io/projected/1b272b73-2eee-4d7e-bcba-e83d2e008636-kube-api-access-b7k5v\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.582787 4914 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-catalog-content\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.582845 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-utilities\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.583408 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-utilities\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.583447 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-catalog-content\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.604029 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7k5v\" (UniqueName: \"kubernetes.io/projected/1b272b73-2eee-4d7e-bcba-e83d2e008636-kube-api-access-b7k5v\") pod \"community-operators-mq4s8\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:14 crc kubenswrapper[4914]: I0130 22:00:14.678907 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:15 crc kubenswrapper[4914]: I0130 22:00:15.253248 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mq4s8"] Jan 30 22:00:15 crc kubenswrapper[4914]: I0130 22:00:15.277585 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mq4s8" event={"ID":"1b272b73-2eee-4d7e-bcba-e83d2e008636","Type":"ContainerStarted","Data":"c16f0aec06e0c69ebebd3ccf7b9f840e4479d4345ca39fe30cd13bf846c65ce7"} Jan 30 22:00:16 crc kubenswrapper[4914]: I0130 22:00:16.288552 4914 generic.go:334] "Generic (PLEG): container finished" podID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerID="40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542" exitCode=0 Jan 30 22:00:16 crc kubenswrapper[4914]: I0130 22:00:16.288841 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mq4s8" event={"ID":"1b272b73-2eee-4d7e-bcba-e83d2e008636","Type":"ContainerDied","Data":"40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542"} Jan 30 22:00:18 crc kubenswrapper[4914]: I0130 22:00:18.311432 4914 generic.go:334] "Generic (PLEG): container finished" podID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerID="df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695" exitCode=0 Jan 30 22:00:18 crc kubenswrapper[4914]: I0130 22:00:18.311489 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mq4s8" event={"ID":"1b272b73-2eee-4d7e-bcba-e83d2e008636","Type":"ContainerDied","Data":"df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695"} Jan 30 22:00:21 crc kubenswrapper[4914]: I0130 22:00:21.341968 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mq4s8" 
event={"ID":"1b272b73-2eee-4d7e-bcba-e83d2e008636","Type":"ContainerStarted","Data":"5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652"} Jan 30 22:00:21 crc kubenswrapper[4914]: I0130 22:00:21.365896 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mq4s8" podStartSLOduration=2.771189594 podStartE2EDuration="7.365880312s" podCreationTimestamp="2026-01-30 22:00:14 +0000 UTC" firstStartedPulling="2026-01-30 22:00:16.291130236 +0000 UTC m=+2749.729766987" lastFinishedPulling="2026-01-30 22:00:20.885820944 +0000 UTC m=+2754.324457705" observedRunningTime="2026-01-30 22:00:21.357385579 +0000 UTC m=+2754.796022340" watchObservedRunningTime="2026-01-30 22:00:21.365880312 +0000 UTC m=+2754.804517063" Jan 30 22:00:24 crc kubenswrapper[4914]: I0130 22:00:24.679262 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:24 crc kubenswrapper[4914]: I0130 22:00:24.679999 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:24 crc kubenswrapper[4914]: I0130 22:00:24.736395 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:34 crc kubenswrapper[4914]: I0130 22:00:34.742600 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:34 crc kubenswrapper[4914]: I0130 22:00:34.811022 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mq4s8"] Jan 30 22:00:35 crc kubenswrapper[4914]: I0130 22:00:35.484413 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mq4s8" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="registry-server" 
containerID="cri-o://5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652" gracePeriod=2 Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.069737 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.080009 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-utilities\") pod \"1b272b73-2eee-4d7e-bcba-e83d2e008636\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.080174 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-catalog-content\") pod \"1b272b73-2eee-4d7e-bcba-e83d2e008636\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.080281 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7k5v\" (UniqueName: \"kubernetes.io/projected/1b272b73-2eee-4d7e-bcba-e83d2e008636-kube-api-access-b7k5v\") pod \"1b272b73-2eee-4d7e-bcba-e83d2e008636\" (UID: \"1b272b73-2eee-4d7e-bcba-e83d2e008636\") " Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.080920 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-utilities" (OuterVolumeSpecName: "utilities") pod "1b272b73-2eee-4d7e-bcba-e83d2e008636" (UID: "1b272b73-2eee-4d7e-bcba-e83d2e008636"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.089327 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b272b73-2eee-4d7e-bcba-e83d2e008636-kube-api-access-b7k5v" (OuterVolumeSpecName: "kube-api-access-b7k5v") pod "1b272b73-2eee-4d7e-bcba-e83d2e008636" (UID: "1b272b73-2eee-4d7e-bcba-e83d2e008636"). InnerVolumeSpecName "kube-api-access-b7k5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.141923 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b272b73-2eee-4d7e-bcba-e83d2e008636" (UID: "1b272b73-2eee-4d7e-bcba-e83d2e008636"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.181938 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.181971 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7k5v\" (UniqueName: \"kubernetes.io/projected/1b272b73-2eee-4d7e-bcba-e83d2e008636-kube-api-access-b7k5v\") on node \"crc\" DevicePath \"\"" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.181986 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b272b73-2eee-4d7e-bcba-e83d2e008636-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.498098 4914 generic.go:334] "Generic (PLEG): container finished" podID="1b272b73-2eee-4d7e-bcba-e83d2e008636" 
containerID="5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652" exitCode=0 Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.498466 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mq4s8" event={"ID":"1b272b73-2eee-4d7e-bcba-e83d2e008636","Type":"ContainerDied","Data":"5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652"} Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.498494 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mq4s8" event={"ID":"1b272b73-2eee-4d7e-bcba-e83d2e008636","Type":"ContainerDied","Data":"c16f0aec06e0c69ebebd3ccf7b9f840e4479d4345ca39fe30cd13bf846c65ce7"} Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.498512 4914 scope.go:117] "RemoveContainer" containerID="5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.498646 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mq4s8" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.525185 4914 scope.go:117] "RemoveContainer" containerID="df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.539045 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mq4s8"] Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.547892 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mq4s8"] Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.567069 4914 scope.go:117] "RemoveContainer" containerID="40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.599425 4914 scope.go:117] "RemoveContainer" containerID="5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652" Jan 30 22:00:36 crc kubenswrapper[4914]: E0130 22:00:36.600467 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652\": container with ID starting with 5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652 not found: ID does not exist" containerID="5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.600517 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652"} err="failed to get container status \"5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652\": rpc error: code = NotFound desc = could not find container \"5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652\": container with ID starting with 5c579521757fb9557fb3901e4ed2478d014ca9f9a32c23b66695521da693e652 not 
found: ID does not exist" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.600549 4914 scope.go:117] "RemoveContainer" containerID="df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695" Jan 30 22:00:36 crc kubenswrapper[4914]: E0130 22:00:36.601088 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695\": container with ID starting with df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695 not found: ID does not exist" containerID="df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.601130 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695"} err="failed to get container status \"df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695\": rpc error: code = NotFound desc = could not find container \"df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695\": container with ID starting with df15aecd8751ffa0a23bacd9ce82177463a31268bea82e51b9e86550348b2695 not found: ID does not exist" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.601167 4914 scope.go:117] "RemoveContainer" containerID="40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542" Jan 30 22:00:36 crc kubenswrapper[4914]: E0130 22:00:36.601517 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542\": container with ID starting with 40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542 not found: ID does not exist" containerID="40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542" Jan 30 22:00:36 crc kubenswrapper[4914]: I0130 22:00:36.601548 4914 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542"} err="failed to get container status \"40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542\": rpc error: code = NotFound desc = could not find container \"40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542\": container with ID starting with 40e8092d9451e244f551d685361f161e5b71fd06960d62eeb79196808c76a542 not found: ID does not exist" Jan 30 22:00:37 crc kubenswrapper[4914]: I0130 22:00:37.834097 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" path="/var/lib/kubelet/pods/1b272b73-2eee-4d7e-bcba-e83d2e008636/volumes" Jan 30 22:00:44 crc kubenswrapper[4914]: I0130 22:00:44.385667 4914 scope.go:117] "RemoveContainer" containerID="1c5e4d37c10ce7fb0002fa4c617f9dd7f53d4300c4be72db9aa287a9f0ecb40d" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.735930 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9sm9n"] Jan 30 22:00:46 crc kubenswrapper[4914]: E0130 22:00:46.736956 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="registry-server" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.736974 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="registry-server" Jan 30 22:00:46 crc kubenswrapper[4914]: E0130 22:00:46.737012 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="extract-content" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.737021 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="extract-content" Jan 30 22:00:46 crc kubenswrapper[4914]: E0130 22:00:46.737037 4914 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="extract-utilities" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.737046 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="extract-utilities" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.737291 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b272b73-2eee-4d7e-bcba-e83d2e008636" containerName="registry-server" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.739186 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.750072 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9sm9n"] Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.806483 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gsvj\" (UniqueName: \"kubernetes.io/projected/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-kube-api-access-8gsvj\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.806654 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-utilities\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.806914 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-catalog-content\") pod 
\"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.908854 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gsvj\" (UniqueName: \"kubernetes.io/projected/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-kube-api-access-8gsvj\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.908937 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-utilities\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.909097 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-catalog-content\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.909780 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-catalog-content\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.910347 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-utilities\") pod \"redhat-operators-9sm9n\" (UID: 
\"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:46 crc kubenswrapper[4914]: I0130 22:00:46.932737 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gsvj\" (UniqueName: \"kubernetes.io/projected/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-kube-api-access-8gsvj\") pod \"redhat-operators-9sm9n\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") " pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:47 crc kubenswrapper[4914]: I0130 22:00:47.065174 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:00:47 crc kubenswrapper[4914]: I0130 22:00:47.527984 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9sm9n"] Jan 30 22:00:47 crc kubenswrapper[4914]: I0130 22:00:47.609475 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerStarted","Data":"097e11028a2812d64ef93fce0fd90f9ec201ae7dd387adf98a28f3b69cec9694"} Jan 30 22:00:48 crc kubenswrapper[4914]: I0130 22:00:48.619512 4914 generic.go:334] "Generic (PLEG): container finished" podID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerID="e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24" exitCode=0 Jan 30 22:00:48 crc kubenswrapper[4914]: I0130 22:00:48.619580 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerDied","Data":"e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24"} Jan 30 22:00:50 crc kubenswrapper[4914]: I0130 22:00:50.638972 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" 
event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerStarted","Data":"f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8"} Jan 30 22:00:57 crc kubenswrapper[4914]: I0130 22:00:57.710153 4914 generic.go:334] "Generic (PLEG): container finished" podID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerID="f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8" exitCode=0 Jan 30 22:00:57 crc kubenswrapper[4914]: I0130 22:00:57.710897 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerDied","Data":"f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8"} Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.147513 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29496841-5k68x"] Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.149350 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.229322 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-fernet-keys\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.229389 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49d77\" (UniqueName: \"kubernetes.io/projected/1804d838-ccaf-46f1-a848-81790716a2f4-kube-api-access-49d77\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.229833 4914 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-config-data\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.229978 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-combined-ca-bundle\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.331301 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-config-data\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.331443 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-combined-ca-bundle\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.331492 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-fernet-keys\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.331519 4914 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-49d77\" (UniqueName: \"kubernetes.io/projected/1804d838-ccaf-46f1-a848-81790716a2f4-kube-api-access-49d77\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.346359 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-config-data\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.347646 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-combined-ca-bundle\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.349608 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-fernet-keys\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.370328 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49d77\" (UniqueName: \"kubernetes.io/projected/1804d838-ccaf-46f1-a848-81790716a2f4-kube-api-access-49d77\") pod \"keystone-cron-29496841-5k68x\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.412410 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496841-5k68x"] Jan 30 22:01:00 crc 
kubenswrapper[4914]: I0130 22:01:00.469298 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:00 crc kubenswrapper[4914]: I0130 22:01:00.986564 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496841-5k68x"] Jan 30 22:01:01 crc kubenswrapper[4914]: I0130 22:01:01.748828 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-5k68x" event={"ID":"1804d838-ccaf-46f1-a848-81790716a2f4","Type":"ContainerStarted","Data":"264981cd27293679fc102f7e7474dc61b21a877007d9483e41704f8fa6c886db"} Jan 30 22:01:01 crc kubenswrapper[4914]: I0130 22:01:01.749171 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-5k68x" event={"ID":"1804d838-ccaf-46f1-a848-81790716a2f4","Type":"ContainerStarted","Data":"83be5752023aa59f27ddcc8e984aaa486dc1bcabd9e4b00a1808015b3ee5be95"} Jan 30 22:01:02 crc kubenswrapper[4914]: I0130 22:01:02.783207 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29496841-5k68x" podStartSLOduration=2.783185116 podStartE2EDuration="2.783185116s" podCreationTimestamp="2026-01-30 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 22:01:02.773770789 +0000 UTC m=+2796.212407550" watchObservedRunningTime="2026-01-30 22:01:02.783185116 +0000 UTC m=+2796.221821877" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.699046 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmlp"] Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.701717 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.725551 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmlp"] Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.815600 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-catalog-content\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.815854 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-utilities\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.815890 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsr9t\" (UniqueName: \"kubernetes.io/projected/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-kube-api-access-gsr9t\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.918981 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-catalog-content\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.919185 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-utilities\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.919224 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsr9t\" (UniqueName: \"kubernetes.io/projected/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-kube-api-access-gsr9t\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.920191 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-catalog-content\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.920250 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-utilities\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:03 crc kubenswrapper[4914]: I0130 22:01:03.941080 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsr9t\" (UniqueName: \"kubernetes.io/projected/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-kube-api-access-gsr9t\") pod \"redhat-marketplace-mxmlp\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:04 crc kubenswrapper[4914]: I0130 22:01:04.040976 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:04 crc kubenswrapper[4914]: I0130 22:01:04.595211 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmlp"] Jan 30 22:01:04 crc kubenswrapper[4914]: I0130 22:01:04.780982 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerStarted","Data":"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"} Jan 30 22:01:04 crc kubenswrapper[4914]: I0130 22:01:04.783303 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerStarted","Data":"9e1176ba960c6d00ccd77d9053be5abdffbeec01cf8f8dcfd0217203e96b7a5a"} Jan 30 22:01:05 crc kubenswrapper[4914]: I0130 22:01:05.796150 4914 generic.go:334] "Generic (PLEG): container finished" podID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerID="87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795" exitCode=0 Jan 30 22:01:05 crc kubenswrapper[4914]: I0130 22:01:05.796222 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerDied","Data":"87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795"} Jan 30 22:01:05 crc kubenswrapper[4914]: I0130 22:01:05.851128 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9sm9n" podStartSLOduration=4.583585549 podStartE2EDuration="19.851101179s" podCreationTimestamp="2026-01-30 22:00:46 +0000 UTC" firstStartedPulling="2026-01-30 22:00:48.622534068 +0000 UTC m=+2782.061170829" lastFinishedPulling="2026-01-30 22:01:03.890049698 +0000 UTC m=+2797.328686459" observedRunningTime="2026-01-30 22:01:05.84780339 +0000 UTC 
m=+2799.286440151" watchObservedRunningTime="2026-01-30 22:01:05.851101179 +0000 UTC m=+2799.289737940" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.104326 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.107423 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.110766 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.111228 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.111313 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.112072 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p8sjb" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.115972 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.177679 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.179402 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.179637 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.179781 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.180113 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-config-data\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.180216 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.180369 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.180469 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzwk\" (UniqueName: \"kubernetes.io/projected/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-kube-api-access-grzwk\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.180568 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282072 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-config-data\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282165 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282245 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" 
Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282271 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzwk\" (UniqueName: \"kubernetes.io/projected/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-kube-api-access-grzwk\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282304 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282343 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282365 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.282446 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc 
kubenswrapper[4914]: I0130 22:01:06.282500 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.283213 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.283476 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.283503 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.284461 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-config-data\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.285329 4914 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.290218 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.291804 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.292381 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.310898 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzwk\" (UniqueName: \"kubernetes.io/projected/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-kube-api-access-grzwk\") pod \"tempest-tests-tempest\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.362734 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" 
(UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") " pod="openstack/tempest-tests-tempest" Jan 30 22:01:06 crc kubenswrapper[4914]: I0130 22:01:06.444464 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 22:01:07 crc kubenswrapper[4914]: I0130 22:01:07.001683 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 22:01:07 crc kubenswrapper[4914]: W0130 22:01:07.008736 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aa1ebbf_1b7b_416b_8fe7_45c318d1d0a0.slice/crio-9ea5d9213ad53bb1e26a334d43c3c8ad33e3df6f003a0b16d8d5592cc7ef4f52 WatchSource:0}: Error finding container 9ea5d9213ad53bb1e26a334d43c3c8ad33e3df6f003a0b16d8d5592cc7ef4f52: Status 404 returned error can't find the container with id 9ea5d9213ad53bb1e26a334d43c3c8ad33e3df6f003a0b16d8d5592cc7ef4f52 Jan 30 22:01:07 crc kubenswrapper[4914]: I0130 22:01:07.066859 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:01:07 crc kubenswrapper[4914]: I0130 22:01:07.066938 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:01:07 crc kubenswrapper[4914]: I0130 22:01:07.842012 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerStarted","Data":"0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3"} Jan 30 22:01:07 crc kubenswrapper[4914]: I0130 22:01:07.848996 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0","Type":"ContainerStarted","Data":"9ea5d9213ad53bb1e26a334d43c3c8ad33e3df6f003a0b16d8d5592cc7ef4f52"} Jan 30 22:01:08 crc 
kubenswrapper[4914]: I0130 22:01:08.125994 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:08 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:08 crc kubenswrapper[4914]: > Jan 30 22:01:10 crc kubenswrapper[4914]: I0130 22:01:10.915760 4914 generic.go:334] "Generic (PLEG): container finished" podID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerID="0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3" exitCode=0 Jan 30 22:01:10 crc kubenswrapper[4914]: I0130 22:01:10.916369 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerDied","Data":"0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3"} Jan 30 22:01:10 crc kubenswrapper[4914]: I0130 22:01:10.925582 4914 generic.go:334] "Generic (PLEG): container finished" podID="1804d838-ccaf-46f1-a848-81790716a2f4" containerID="264981cd27293679fc102f7e7474dc61b21a877007d9483e41704f8fa6c886db" exitCode=0 Jan 30 22:01:10 crc kubenswrapper[4914]: I0130 22:01:10.925636 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-5k68x" event={"ID":"1804d838-ccaf-46f1-a848-81790716a2f4","Type":"ContainerDied","Data":"264981cd27293679fc102f7e7474dc61b21a877007d9483e41704f8fa6c886db"} Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.578222 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.625793 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49d77\" (UniqueName: \"kubernetes.io/projected/1804d838-ccaf-46f1-a848-81790716a2f4-kube-api-access-49d77\") pod \"1804d838-ccaf-46f1-a848-81790716a2f4\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.625995 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-config-data\") pod \"1804d838-ccaf-46f1-a848-81790716a2f4\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.626097 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-combined-ca-bundle\") pod \"1804d838-ccaf-46f1-a848-81790716a2f4\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.626119 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-fernet-keys\") pod \"1804d838-ccaf-46f1-a848-81790716a2f4\" (UID: \"1804d838-ccaf-46f1-a848-81790716a2f4\") " Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.645600 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1804d838-ccaf-46f1-a848-81790716a2f4-kube-api-access-49d77" (OuterVolumeSpecName: "kube-api-access-49d77") pod "1804d838-ccaf-46f1-a848-81790716a2f4" (UID: "1804d838-ccaf-46f1-a848-81790716a2f4"). InnerVolumeSpecName "kube-api-access-49d77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.646303 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1804d838-ccaf-46f1-a848-81790716a2f4" (UID: "1804d838-ccaf-46f1-a848-81790716a2f4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.666968 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1804d838-ccaf-46f1-a848-81790716a2f4" (UID: "1804d838-ccaf-46f1-a848-81790716a2f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.713356 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-config-data" (OuterVolumeSpecName: "config-data") pod "1804d838-ccaf-46f1-a848-81790716a2f4" (UID: "1804d838-ccaf-46f1-a848-81790716a2f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.729784 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49d77\" (UniqueName: \"kubernetes.io/projected/1804d838-ccaf-46f1-a848-81790716a2f4-kube-api-access-49d77\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.729814 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.729829 4914 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:15 crc kubenswrapper[4914]: I0130 22:01:15.729839 4914 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1804d838-ccaf-46f1-a848-81790716a2f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:16 crc kubenswrapper[4914]: I0130 22:01:16.015345 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-5k68x" event={"ID":"1804d838-ccaf-46f1-a848-81790716a2f4","Type":"ContainerDied","Data":"83be5752023aa59f27ddcc8e984aaa486dc1bcabd9e4b00a1808015b3ee5be95"} Jan 30 22:01:16 crc kubenswrapper[4914]: I0130 22:01:16.015790 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83be5752023aa59f27ddcc8e984aaa486dc1bcabd9e4b00a1808015b3ee5be95" Jan 30 22:01:16 crc kubenswrapper[4914]: I0130 22:01:16.015418 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496841-5k68x" Jan 30 22:01:18 crc kubenswrapper[4914]: I0130 22:01:18.038242 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerStarted","Data":"8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8"} Jan 30 22:01:18 crc kubenswrapper[4914]: I0130 22:01:18.072848 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mxmlp" podStartSLOduration=4.136064991 podStartE2EDuration="15.07282851s" podCreationTimestamp="2026-01-30 22:01:03 +0000 UTC" firstStartedPulling="2026-01-30 22:01:05.800974552 +0000 UTC m=+2799.239611313" lastFinishedPulling="2026-01-30 22:01:16.737738071 +0000 UTC m=+2810.176374832" observedRunningTime="2026-01-30 22:01:18.055944743 +0000 UTC m=+2811.494581504" watchObservedRunningTime="2026-01-30 22:01:18.07282851 +0000 UTC m=+2811.511465271" Jan 30 22:01:18 crc kubenswrapper[4914]: I0130 22:01:18.123536 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:18 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:18 crc kubenswrapper[4914]: > Jan 30 22:01:24 crc kubenswrapper[4914]: I0130 22:01:24.042033 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:24 crc kubenswrapper[4914]: I0130 22:01:24.042635 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:01:25 crc kubenswrapper[4914]: I0130 22:01:25.117231 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mxmlp" 
podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:25 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:25 crc kubenswrapper[4914]: > Jan 30 22:01:28 crc kubenswrapper[4914]: I0130 22:01:28.132425 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:28 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:28 crc kubenswrapper[4914]: > Jan 30 22:01:29 crc kubenswrapper[4914]: I0130 22:01:29.940834 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w2jbb"] Jan 30 22:01:29 crc kubenswrapper[4914]: E0130 22:01:29.941772 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1804d838-ccaf-46f1-a848-81790716a2f4" containerName="keystone-cron" Jan 30 22:01:29 crc kubenswrapper[4914]: I0130 22:01:29.941790 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="1804d838-ccaf-46f1-a848-81790716a2f4" containerName="keystone-cron" Jan 30 22:01:29 crc kubenswrapper[4914]: I0130 22:01:29.942024 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="1804d838-ccaf-46f1-a848-81790716a2f4" containerName="keystone-cron" Jan 30 22:01:29 crc kubenswrapper[4914]: I0130 22:01:29.944163 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:29 crc kubenswrapper[4914]: I0130 22:01:29.970162 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2jbb"] Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.053287 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-catalog-content\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.053369 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9clvf\" (UniqueName: \"kubernetes.io/projected/3a275cea-d5a3-4f1b-9f75-82d757d114e4-kube-api-access-9clvf\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.053661 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-utilities\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.155559 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-catalog-content\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.155659 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9clvf\" (UniqueName: \"kubernetes.io/projected/3a275cea-d5a3-4f1b-9f75-82d757d114e4-kube-api-access-9clvf\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.155794 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-utilities\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.156098 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-catalog-content\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.156354 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-utilities\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.176055 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9clvf\" (UniqueName: \"kubernetes.io/projected/3a275cea-d5a3-4f1b-9f75-82d757d114e4-kube-api-access-9clvf\") pod \"certified-operators-w2jbb\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:30 crc kubenswrapper[4914]: I0130 22:01:30.288176 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:01:35 crc kubenswrapper[4914]: I0130 22:01:35.097554 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mxmlp" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:35 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:35 crc kubenswrapper[4914]: > Jan 30 22:01:38 crc kubenswrapper[4914]: I0130 22:01:38.115526 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:38 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:38 crc kubenswrapper[4914]: > Jan 30 22:01:45 crc kubenswrapper[4914]: I0130 22:01:45.095388 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mxmlp" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:45 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:45 crc kubenswrapper[4914]: > Jan 30 22:01:48 crc kubenswrapper[4914]: I0130 22:01:48.111933 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:48 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:48 crc kubenswrapper[4914]: > Jan 30 22:01:55 crc kubenswrapper[4914]: I0130 22:01:55.108935 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mxmlp" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" 
probeResult="failure" output=< Jan 30 22:01:55 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:55 crc kubenswrapper[4914]: > Jan 30 22:01:56 crc kubenswrapper[4914]: I0130 22:01:56.983412 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:01:56 crc kubenswrapper[4914]: I0130 22:01:56.983811 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:01:58 crc kubenswrapper[4914]: I0130 22:01:58.107877 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:01:58 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:01:58 crc kubenswrapper[4914]: > Jan 30 22:01:58 crc kubenswrapper[4914]: I0130 22:01:58.794381 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2jbb"] Jan 30 22:01:59 crc kubenswrapper[4914]: E0130 22:01:58.999831 4914 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 30 22:01:59 crc kubenswrapper[4914]: E0130 22:01:59.000193 4914 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grzwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 22:01:59 crc kubenswrapper[4914]: E0130 22:01:59.001357 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" Jan 30 22:01:59 crc kubenswrapper[4914]: I0130 22:01:59.488291 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerStarted","Data":"848bf0ce06a56db526dab66e417efac67f06c69bb3a4c65dea6bab247e59df53"} Jan 30 22:01:59 crc kubenswrapper[4914]: E0130 22:01:59.489779 4914 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" Jan 30 22:02:00 crc kubenswrapper[4914]: I0130 22:02:00.499032 4914 generic.go:334] "Generic (PLEG): container finished" podID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerID="a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78" exitCode=0 Jan 30 22:02:00 crc kubenswrapper[4914]: I0130 22:02:00.499148 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerDied","Data":"a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78"} Jan 30 22:02:02 crc kubenswrapper[4914]: I0130 22:02:02.530592 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerStarted","Data":"e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d"} Jan 30 22:02:03 crc kubenswrapper[4914]: I0130 22:02:03.544004 4914 generic.go:334] "Generic (PLEG): container finished" podID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerID="e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d" exitCode=0 Jan 30 22:02:03 crc kubenswrapper[4914]: I0130 22:02:03.544040 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerDied","Data":"e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d"} Jan 30 22:02:04 crc kubenswrapper[4914]: I0130 22:02:04.556081 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" 
event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerStarted","Data":"9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94"} Jan 30 22:02:04 crc kubenswrapper[4914]: I0130 22:02:04.577920 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w2jbb" podStartSLOduration=31.820667627 podStartE2EDuration="35.577899928s" podCreationTimestamp="2026-01-30 22:01:29 +0000 UTC" firstStartedPulling="2026-01-30 22:02:00.500873806 +0000 UTC m=+2853.939510567" lastFinishedPulling="2026-01-30 22:02:04.258106117 +0000 UTC m=+2857.696742868" observedRunningTime="2026-01-30 22:02:04.574627569 +0000 UTC m=+2858.013264340" watchObservedRunningTime="2026-01-30 22:02:04.577899928 +0000 UTC m=+2858.016536699" Jan 30 22:02:05 crc kubenswrapper[4914]: I0130 22:02:05.086200 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mxmlp" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" probeResult="failure" output=< Jan 30 22:02:05 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:02:05 crc kubenswrapper[4914]: > Jan 30 22:02:08 crc kubenswrapper[4914]: I0130 22:02:08.133415 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:02:08 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:02:08 crc kubenswrapper[4914]: > Jan 30 22:02:10 crc kubenswrapper[4914]: I0130 22:02:10.289085 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:02:10 crc kubenswrapper[4914]: I0130 22:02:10.289620 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:02:10 crc kubenswrapper[4914]: I0130 22:02:10.339286 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:02:10 crc kubenswrapper[4914]: I0130 22:02:10.666784 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:02:10 crc kubenswrapper[4914]: I0130 22:02:10.720336 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2jbb"] Jan 30 22:02:12 crc kubenswrapper[4914]: I0130 22:02:12.491155 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 22:02:12 crc kubenswrapper[4914]: I0130 22:02:12.627967 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w2jbb" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="registry-server" containerID="cri-o://9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94" gracePeriod=2 Jan 30 22:02:12 crc kubenswrapper[4914]: E0130 22:02:12.839375 4914 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a275cea_d5a3_4f1b_9f75_82d757d114e4.slice/crio-9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a275cea_d5a3_4f1b_9f75_82d757d114e4.slice/crio-conmon-9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94.scope\": RecentStats: unable to find data in memory cache]" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.143626 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.245957 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-utilities\") pod \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.246247 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-catalog-content\") pod \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.246308 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9clvf\" (UniqueName: \"kubernetes.io/projected/3a275cea-d5a3-4f1b-9f75-82d757d114e4-kube-api-access-9clvf\") pod \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\" (UID: \"3a275cea-d5a3-4f1b-9f75-82d757d114e4\") " Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.247573 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-utilities" (OuterVolumeSpecName: "utilities") pod "3a275cea-d5a3-4f1b-9f75-82d757d114e4" (UID: "3a275cea-d5a3-4f1b-9f75-82d757d114e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.255276 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a275cea-d5a3-4f1b-9f75-82d757d114e4-kube-api-access-9clvf" (OuterVolumeSpecName: "kube-api-access-9clvf") pod "3a275cea-d5a3-4f1b-9f75-82d757d114e4" (UID: "3a275cea-d5a3-4f1b-9f75-82d757d114e4"). InnerVolumeSpecName "kube-api-access-9clvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.301831 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a275cea-d5a3-4f1b-9f75-82d757d114e4" (UID: "3a275cea-d5a3-4f1b-9f75-82d757d114e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.349436 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.349469 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a275cea-d5a3-4f1b-9f75-82d757d114e4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.349483 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9clvf\" (UniqueName: \"kubernetes.io/projected/3a275cea-d5a3-4f1b-9f75-82d757d114e4-kube-api-access-9clvf\") on node \"crc\" DevicePath \"\"" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.639679 4914 generic.go:334] "Generic (PLEG): container finished" podID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerID="9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94" exitCode=0 Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.639742 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerDied","Data":"9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94"} Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.639765 4914 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2jbb" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.639790 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jbb" event={"ID":"3a275cea-d5a3-4f1b-9f75-82d757d114e4","Type":"ContainerDied","Data":"848bf0ce06a56db526dab66e417efac67f06c69bb3a4c65dea6bab247e59df53"} Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.639811 4914 scope.go:117] "RemoveContainer" containerID="9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.685074 4914 scope.go:117] "RemoveContainer" containerID="e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.694795 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2jbb"] Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.706895 4914 scope.go:117] "RemoveContainer" containerID="a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.710660 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w2jbb"] Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.771812 4914 scope.go:117] "RemoveContainer" containerID="9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94" Jan 30 22:02:13 crc kubenswrapper[4914]: E0130 22:02:13.772456 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94\": container with ID starting with 9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94 not found: ID does not exist" containerID="9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.772498 
4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94"} err="failed to get container status \"9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94\": rpc error: code = NotFound desc = could not find container \"9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94\": container with ID starting with 9b8a2b895767f51ae4477d28bb5bf492000a2081525c00f170d7c2702fa32c94 not found: ID does not exist" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.772531 4914 scope.go:117] "RemoveContainer" containerID="e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d" Jan 30 22:02:13 crc kubenswrapper[4914]: E0130 22:02:13.772985 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d\": container with ID starting with e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d not found: ID does not exist" containerID="e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.773015 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d"} err="failed to get container status \"e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d\": rpc error: code = NotFound desc = could not find container \"e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d\": container with ID starting with e0462d3d5128f5f8f1e314f1b353959cdd5d3f08a267331570c659bb7330999d not found: ID does not exist" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.773037 4914 scope.go:117] "RemoveContainer" containerID="a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78" Jan 30 22:02:13 crc kubenswrapper[4914]: E0130 
22:02:13.773577 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78\": container with ID starting with a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78 not found: ID does not exist" containerID="a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.773598 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78"} err="failed to get container status \"a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78\": rpc error: code = NotFound desc = could not find container \"a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78\": container with ID starting with a1061cd94a83436f692a0e2f7f03050c9a4925b1a25122f4a619cde938478f78 not found: ID does not exist" Jan 30 22:02:13 crc kubenswrapper[4914]: I0130 22:02:13.833259 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" path="/var/lib/kubelet/pods/3a275cea-d5a3-4f1b-9f75-82d757d114e4/volumes" Jan 30 22:02:14 crc kubenswrapper[4914]: I0130 22:02:14.088576 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:02:14 crc kubenswrapper[4914]: I0130 22:02:14.146301 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:02:14 crc kubenswrapper[4914]: I0130 22:02:14.658208 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0","Type":"ContainerStarted","Data":"b19c561b439e909b6e38dd9791f3efedab3976625b519896135a1a981ca940ac"} Jan 30 22:02:14 crc kubenswrapper[4914]: I0130 
22:02:14.681981 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.267080094 podStartE2EDuration="1m9.681960396s" podCreationTimestamp="2026-01-30 22:01:05 +0000 UTC" firstStartedPulling="2026-01-30 22:01:07.069302133 +0000 UTC m=+2800.507938894" lastFinishedPulling="2026-01-30 22:02:12.484182435 +0000 UTC m=+2865.922819196" observedRunningTime="2026-01-30 22:02:14.676931975 +0000 UTC m=+2868.115568756" watchObservedRunningTime="2026-01-30 22:02:14.681960396 +0000 UTC m=+2868.120597177" Jan 30 22:02:15 crc kubenswrapper[4914]: I0130 22:02:15.975920 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmlp"] Jan 30 22:02:15 crc kubenswrapper[4914]: I0130 22:02:15.976346 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mxmlp" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" containerID="cri-o://8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8" gracePeriod=2 Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.522466 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.634552 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-catalog-content\") pod \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.634948 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsr9t\" (UniqueName: \"kubernetes.io/projected/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-kube-api-access-gsr9t\") pod \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.635040 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-utilities\") pod \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\" (UID: \"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85\") " Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.636420 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-utilities" (OuterVolumeSpecName: "utilities") pod "7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" (UID: "7b4d1e9e-b50e-4947-a6ca-6ec48958ed85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.643047 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-kube-api-access-gsr9t" (OuterVolumeSpecName: "kube-api-access-gsr9t") pod "7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" (UID: "7b4d1e9e-b50e-4947-a6ca-6ec48958ed85"). InnerVolumeSpecName "kube-api-access-gsr9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.662033 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" (UID: "7b4d1e9e-b50e-4947-a6ca-6ec48958ed85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.684674 4914 generic.go:334] "Generic (PLEG): container finished" podID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerID="8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8" exitCode=0 Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.684694 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmlp" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.684732 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerDied","Data":"8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8"} Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.685149 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmlp" event={"ID":"7b4d1e9e-b50e-4947-a6ca-6ec48958ed85","Type":"ContainerDied","Data":"9e1176ba960c6d00ccd77d9053be5abdffbeec01cf8f8dcfd0217203e96b7a5a"} Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.685175 4914 scope.go:117] "RemoveContainer" containerID="8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.735877 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmlp"] Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 
22:02:16.738742 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.738780 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsr9t\" (UniqueName: \"kubernetes.io/projected/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-kube-api-access-gsr9t\") on node \"crc\" DevicePath \"\"" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.738796 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.740130 4914 scope.go:117] "RemoveContainer" containerID="0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.748433 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmlp"] Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.769292 4914 scope.go:117] "RemoveContainer" containerID="87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.828919 4914 scope.go:117] "RemoveContainer" containerID="8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8" Jan 30 22:02:16 crc kubenswrapper[4914]: E0130 22:02:16.830199 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8\": container with ID starting with 8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8 not found: ID does not exist" containerID="8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.830232 
4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8"} err="failed to get container status \"8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8\": rpc error: code = NotFound desc = could not find container \"8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8\": container with ID starting with 8ff42410839555f72e0d2dee338ad41dd76810145281eed00301164388d781e8 not found: ID does not exist" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.830254 4914 scope.go:117] "RemoveContainer" containerID="0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3" Jan 30 22:02:16 crc kubenswrapper[4914]: E0130 22:02:16.830814 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3\": container with ID starting with 0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3 not found: ID does not exist" containerID="0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.830863 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3"} err="failed to get container status \"0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3\": rpc error: code = NotFound desc = could not find container \"0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3\": container with ID starting with 0589fd50231b5bd0ee45b13e33f412e38d01e319d7d7142a4476d1014415e3f3 not found: ID does not exist" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.830944 4914 scope.go:117] "RemoveContainer" containerID="87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795" Jan 30 22:02:16 crc kubenswrapper[4914]: E0130 
22:02:16.831812 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795\": container with ID starting with 87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795 not found: ID does not exist" containerID="87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795" Jan 30 22:02:16 crc kubenswrapper[4914]: I0130 22:02:16.831867 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795"} err="failed to get container status \"87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795\": rpc error: code = NotFound desc = could not find container \"87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795\": container with ID starting with 87e59c43ac16d60af7ae1fa2bbb959eb29164f595f296ee80d503422499e6795 not found: ID does not exist" Jan 30 22:02:17 crc kubenswrapper[4914]: I0130 22:02:17.831726 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" path="/var/lib/kubelet/pods/7b4d1e9e-b50e-4947-a6ca-6ec48958ed85/volumes" Jan 30 22:02:18 crc kubenswrapper[4914]: I0130 22:02:18.113007 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:02:18 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:02:18 crc kubenswrapper[4914]: > Jan 30 22:02:26 crc kubenswrapper[4914]: I0130 22:02:26.983253 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 30 22:02:26 crc kubenswrapper[4914]: I0130 22:02:26.983901 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:02:28 crc kubenswrapper[4914]: I0130 22:02:28.113220 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:02:28 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:02:28 crc kubenswrapper[4914]: > Jan 30 22:02:38 crc kubenswrapper[4914]: I0130 22:02:38.122139 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=< Jan 30 22:02:38 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:02:38 crc kubenswrapper[4914]: > Jan 30 22:02:38 crc kubenswrapper[4914]: I0130 22:02:38.122732 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9sm9n" Jan 30 22:02:38 crc kubenswrapper[4914]: I0130 22:02:38.123600 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"} pod="openshift-marketplace/redhat-operators-9sm9n" containerMessage="Container registry-server failed startup probe, will be restarted" Jan 30 22:02:38 crc kubenswrapper[4914]: I0130 22:02:38.123637 4914 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" containerID="cri-o://09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10" gracePeriod=30
Jan 30 22:02:52 crc kubenswrapper[4914]: I0130 22:02:52.344524 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 30 22:02:53 crc kubenswrapper[4914]: I0130 22:02:53.066827 4914 generic.go:334] "Generic (PLEG): container finished" podID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerID="09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10" exitCode=0
Jan 30 22:02:53 crc kubenswrapper[4914]: I0130 22:02:53.066918 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerDied","Data":"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"}
Jan 30 22:02:53 crc kubenswrapper[4914]: I0130 22:02:53.067252 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerStarted","Data":"d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22"}
Jan 30 22:02:56 crc kubenswrapper[4914]: I0130 22:02:56.982964 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:02:56 crc kubenswrapper[4914]: I0130 22:02:56.983572 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:02:56 crc kubenswrapper[4914]: I0130 22:02:56.983650 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 22:02:56 crc kubenswrapper[4914]: I0130 22:02:56.986283 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c7d42f95932576854d92f120d17b18c9c987ff1795ac1cf8cef01b20f9ab65c"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 22:02:56 crc kubenswrapper[4914]: I0130 22:02:56.986379 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://8c7d42f95932576854d92f120d17b18c9c987ff1795ac1cf8cef01b20f9ab65c" gracePeriod=600
Jan 30 22:02:57 crc kubenswrapper[4914]: I0130 22:02:57.066733 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9sm9n"
Jan 30 22:02:57 crc kubenswrapper[4914]: I0130 22:02:57.067019 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9sm9n"
Jan 30 22:02:58 crc kubenswrapper[4914]: I0130 22:02:58.135930 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" probeResult="failure" output=<
Jan 30 22:02:58 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 22:02:58 crc kubenswrapper[4914]: >
Jan 30 22:02:58 crc kubenswrapper[4914]: I0130 22:02:58.140695 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="8c7d42f95932576854d92f120d17b18c9c987ff1795ac1cf8cef01b20f9ab65c" exitCode=0
Jan 30 22:02:58 crc kubenswrapper[4914]: I0130 22:02:58.140735 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"8c7d42f95932576854d92f120d17b18c9c987ff1795ac1cf8cef01b20f9ab65c"}
Jan 30 22:02:58 crc kubenswrapper[4914]: I0130 22:02:58.140793 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"}
Jan 30 22:02:58 crc kubenswrapper[4914]: I0130 22:02:58.140831 4914 scope.go:117] "RemoveContainer" containerID="97b1dcd5f06015c42abd8ae400b3a3ca51d72ba67e709a09e7adcf8fd0e842e5"
Jan 30 22:03:07 crc kubenswrapper[4914]: I0130 22:03:07.114291 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9sm9n"
Jan 30 22:03:07 crc kubenswrapper[4914]: I0130 22:03:07.167595 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9sm9n"
Jan 30 22:03:07 crc kubenswrapper[4914]: I0130 22:03:07.362583 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9sm9n"]
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.227665 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9sm9n" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" containerID="cri-o://d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22" gracePeriod=2
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.812046 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9sm9n"
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.972541 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gsvj\" (UniqueName: \"kubernetes.io/projected/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-kube-api-access-8gsvj\") pod \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") "
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.972646 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-utilities\") pod \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") "
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.972874 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-catalog-content\") pod \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\" (UID: \"b4c86b39-743b-4e2f-b07d-26a46d2f55ec\") "
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.973319 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-utilities" (OuterVolumeSpecName: "utilities") pod "b4c86b39-743b-4e2f-b07d-26a46d2f55ec" (UID: "b4c86b39-743b-4e2f-b07d-26a46d2f55ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.973588 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 22:03:08 crc kubenswrapper[4914]: I0130 22:03:08.985905 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-kube-api-access-8gsvj" (OuterVolumeSpecName: "kube-api-access-8gsvj") pod "b4c86b39-743b-4e2f-b07d-26a46d2f55ec" (UID: "b4c86b39-743b-4e2f-b07d-26a46d2f55ec"). InnerVolumeSpecName "kube-api-access-8gsvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.076519 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gsvj\" (UniqueName: \"kubernetes.io/projected/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-kube-api-access-8gsvj\") on node \"crc\" DevicePath \"\""
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.114059 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4c86b39-743b-4e2f-b07d-26a46d2f55ec" (UID: "b4c86b39-743b-4e2f-b07d-26a46d2f55ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.178852 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c86b39-743b-4e2f-b07d-26a46d2f55ec-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.237937 4914 generic.go:334] "Generic (PLEG): container finished" podID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerID="d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22" exitCode=0
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.237979 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerDied","Data":"d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22"}
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.238009 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9sm9n" event={"ID":"b4c86b39-743b-4e2f-b07d-26a46d2f55ec","Type":"ContainerDied","Data":"097e11028a2812d64ef93fce0fd90f9ec201ae7dd387adf98a28f3b69cec9694"}
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.238027 4914 scope.go:117] "RemoveContainer" containerID="d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.238048 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9sm9n"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.267284 4914 scope.go:117] "RemoveContainer" containerID="09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.280578 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9sm9n"]
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.289738 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9sm9n"]
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.297007 4914 scope.go:117] "RemoveContainer" containerID="f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.325471 4914 scope.go:117] "RemoveContainer" containerID="e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.365495 4914 scope.go:117] "RemoveContainer" containerID="d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22"
Jan 30 22:03:09 crc kubenswrapper[4914]: E0130 22:03:09.370369 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22\": container with ID starting with d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22 not found: ID does not exist" containerID="d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.370425 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22"} err="failed to get container status \"d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22\": rpc error: code = NotFound desc = could not find container \"d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22\": container with ID starting with d0af67913c3740c85d757fea07e0fd51959acd56b201eb71a1fe230f82b26d22 not found: ID does not exist"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.370485 4914 scope.go:117] "RemoveContainer" containerID="09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"
Jan 30 22:03:09 crc kubenswrapper[4914]: E0130 22:03:09.371102 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10\": container with ID starting with 09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10 not found: ID does not exist" containerID="09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.371147 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10"} err="failed to get container status \"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10\": rpc error: code = NotFound desc = could not find container \"09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10\": container with ID starting with 09e34b32161ba3c16082216f9a02c9739b0ffbc02054e08b289e3f789c81cd10 not found: ID does not exist"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.371196 4914 scope.go:117] "RemoveContainer" containerID="f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8"
Jan 30 22:03:09 crc kubenswrapper[4914]: E0130 22:03:09.371570 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8\": container with ID starting with f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8 not found: ID does not exist" containerID="f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.371607 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8"} err="failed to get container status \"f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8\": rpc error: code = NotFound desc = could not find container \"f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8\": container with ID starting with f48afbf95e9aced1248f9cf8c5379a6a97c97df8adbdaa3605e088b149c2fbf8 not found: ID does not exist"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.371627 4914 scope.go:117] "RemoveContainer" containerID="e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24"
Jan 30 22:03:09 crc kubenswrapper[4914]: E0130 22:03:09.372059 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24\": container with ID starting with e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24 not found: ID does not exist" containerID="e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.372122 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24"} err="failed to get container status \"e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24\": rpc error: code = NotFound desc = could not find container \"e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24\": container with ID starting with e1403d7a9202ad97d108a5584f661c4d575ecfedfe51bd79f5379f71476fde24 not found: ID does not exist"
Jan 30 22:03:09 crc kubenswrapper[4914]: I0130 22:03:09.832029 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" path="/var/lib/kubelet/pods/b4c86b39-743b-4e2f-b07d-26a46d2f55ec/volumes"
Jan 30 22:05:26 crc kubenswrapper[4914]: I0130 22:05:26.983623 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:05:26 crc kubenswrapper[4914]: I0130 22:05:26.984146 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:05:56 crc kubenswrapper[4914]: I0130 22:05:56.983796 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:05:56 crc kubenswrapper[4914]: I0130 22:05:56.984368 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:06:26 crc kubenswrapper[4914]: I0130 22:06:26.983478 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:06:26 crc kubenswrapper[4914]: I0130 22:06:26.984073 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:06:26 crc kubenswrapper[4914]: I0130 22:06:26.984124 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 22:06:26 crc kubenswrapper[4914]: I0130 22:06:26.985126 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 22:06:26 crc kubenswrapper[4914]: I0130 22:06:26.985194 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" gracePeriod=600
Jan 30 22:06:27 crc kubenswrapper[4914]: E0130 22:06:27.108371 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:06:27 crc kubenswrapper[4914]: I0130 22:06:27.149113 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" exitCode=0
Jan 30 22:06:27 crc kubenswrapper[4914]: I0130 22:06:27.149153 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"}
Jan 30 22:06:27 crc kubenswrapper[4914]: I0130 22:06:27.149190 4914 scope.go:117] "RemoveContainer" containerID="8c7d42f95932576854d92f120d17b18c9c987ff1795ac1cf8cef01b20f9ab65c"
Jan 30 22:06:27 crc kubenswrapper[4914]: I0130 22:06:27.149852 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:06:27 crc kubenswrapper[4914]: E0130 22:06:27.150153 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:06:40 crc kubenswrapper[4914]: I0130 22:06:40.817967 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:06:40 crc kubenswrapper[4914]: E0130 22:06:40.818763 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:06:55 crc kubenswrapper[4914]: I0130 22:06:55.818091 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:06:55 crc kubenswrapper[4914]: E0130 22:06:55.818797 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:07:10 crc kubenswrapper[4914]: I0130 22:07:10.818112 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:07:10 crc kubenswrapper[4914]: E0130 22:07:10.819006 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:07:22 crc kubenswrapper[4914]: I0130 22:07:22.817685 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:07:22 crc kubenswrapper[4914]: E0130 22:07:22.818342 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:07:37 crc kubenswrapper[4914]: I0130 22:07:37.825358 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:07:37 crc kubenswrapper[4914]: E0130 22:07:37.826130 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:07:51 crc kubenswrapper[4914]: I0130 22:07:51.953183 4914 generic.go:334] "Generic (PLEG): container finished" podID="3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" containerID="b19c561b439e909b6e38dd9791f3efedab3976625b519896135a1a981ca940ac" exitCode=0
Jan 30 22:07:51 crc kubenswrapper[4914]: I0130 22:07:51.953226 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0","Type":"ContainerDied","Data":"b19c561b439e909b6e38dd9791f3efedab3976625b519896135a1a981ca940ac"}
Jan 30 22:07:52 crc kubenswrapper[4914]: I0130 22:07:52.819749 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:07:52 crc kubenswrapper[4914]: E0130 22:07:52.820359 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.638212 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.773254 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ca-certs\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.773318 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-temporary\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.773350 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ssh-key\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.773372 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config-secret\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.773426 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.773905 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.774415 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzwk\" (UniqueName: \"kubernetes.io/projected/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-kube-api-access-grzwk\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.774461 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-config-data\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.774307 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.774566 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-workdir\") pod \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\" (UID: \"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0\") "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.775484 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-config-data" (OuterVolumeSpecName: "config-data") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.776081 4914 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.776181 4914 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.778756 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.795355 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-kube-api-access-grzwk" (OuterVolumeSpecName: "kube-api-access-grzwk") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "kube-api-access-grzwk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.803133 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.804870 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.806463 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.833512 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.878919 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.878985 4914 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.879000 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzwk\" (UniqueName: \"kubernetes.io/projected/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-kube-api-access-grzwk\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.879014 4914 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ca-certs\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.879025 4914 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-ssh-key\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.879038 4914 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.901528 4914 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.983246 4914 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.986807 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0","Type":"ContainerDied","Data":"9ea5d9213ad53bb1e26a334d43c3c8ad33e3df6f003a0b16d8d5592cc7ef4f52"}
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.987033 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea5d9213ad53bb1e26a334d43c3c8ad33e3df6f003a0b16d8d5592cc7ef4f52"
Jan 30 22:07:53 crc kubenswrapper[4914]: I0130 22:07:53.986881 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 30 22:07:54 crc kubenswrapper[4914]: I0130 22:07:54.178967 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" (UID: "3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:07:54 crc kubenswrapper[4914]: I0130 22:07:54.187878 4914 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Jan 30 22:08:04 crc kubenswrapper[4914]: I0130 22:08:04.818517 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:08:04 crc kubenswrapper[4914]: E0130 22:08:04.819763 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762134 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762801 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="extract-utilities"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762817 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="extract-utilities"
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762826 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="extract-utilities"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762833 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="extract-utilities"
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762842 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="extract-content"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762849 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="extract-content"
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762859 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762865 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server"
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762871 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="extract-content"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762877 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="extract-content"
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762905 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="extract-utilities"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762910 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="extract-utilities"
Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762921 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="extract-content"
Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762927 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec"
containerName="extract-content" Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762934 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762940 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762963 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762969 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.762989 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" containerName="tempest-tests-tempest-tests-runner" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.762995 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" containerName="tempest-tests-tempest-tests-runner" Jan 30 22:08:05 crc kubenswrapper[4914]: E0130 22:08:05.763006 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763011 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763179 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4d1e9e-b50e-4947-a6ca-6ec48958ed85" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763196 4914 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763204 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a275cea-d5a3-4f1b-9f75-82d757d114e4" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763212 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c86b39-743b-4e2f-b07d-26a46d2f55ec" containerName="registry-server" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763224 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0" containerName="tempest-tests-tempest-tests-runner" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.763896 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.765886 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p8sjb" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.792341 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.961665 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:05 crc kubenswrapper[4914]: I0130 22:08:05.961864 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78rm\" (UniqueName: \"kubernetes.io/projected/02341028-de3c-48cc-af26-413ff79477a0-kube-api-access-p78rm\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.064287 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.064380 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78rm\" (UniqueName: \"kubernetes.io/projected/02341028-de3c-48cc-af26-413ff79477a0-kube-api-access-p78rm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.064834 4914 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.104613 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78rm\" (UniqueName: \"kubernetes.io/projected/02341028-de3c-48cc-af26-413ff79477a0-kube-api-access-p78rm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.125944 
4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"02341028-de3c-48cc-af26-413ff79477a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.388006 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.871413 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 22:08:06 crc kubenswrapper[4914]: I0130 22:08:06.877128 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:08:07 crc kubenswrapper[4914]: I0130 22:08:07.136772 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"02341028-de3c-48cc-af26-413ff79477a0","Type":"ContainerStarted","Data":"a97ea7efc8efef20beb73b8436d9aebee9cd839b39a4c5f9f051025651a1e8df"} Jan 30 22:08:12 crc kubenswrapper[4914]: I0130 22:08:12.182195 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"02341028-de3c-48cc-af26-413ff79477a0","Type":"ContainerStarted","Data":"f63ea21244c397d3e0a80d03f8790c0cbe1b66858d951ffb9245a0fe2f3746d0"} Jan 30 22:08:12 crc kubenswrapper[4914]: I0130 22:08:12.197330 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.438536204 podStartE2EDuration="7.197309871s" podCreationTimestamp="2026-01-30 22:08:05 +0000 UTC" firstStartedPulling="2026-01-30 22:08:06.876870756 +0000 UTC m=+3220.315507517" 
lastFinishedPulling="2026-01-30 22:08:10.635644423 +0000 UTC m=+3224.074281184" observedRunningTime="2026-01-30 22:08:12.194322139 +0000 UTC m=+3225.632958910" watchObservedRunningTime="2026-01-30 22:08:12.197309871 +0000 UTC m=+3225.635946632" Jan 30 22:08:19 crc kubenswrapper[4914]: I0130 22:08:19.818837 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:08:19 crc kubenswrapper[4914]: E0130 22:08:19.820901 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:08:33 crc kubenswrapper[4914]: I0130 22:08:33.819040 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:08:33 crc kubenswrapper[4914]: E0130 22:08:33.819957 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:08:46 crc kubenswrapper[4914]: I0130 22:08:46.818667 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:08:46 crc kubenswrapper[4914]: E0130 22:08:46.819474 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.287598 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5gk9h/must-gather-4w8kz"] Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.289861 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.292594 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5gk9h"/"default-dockercfg-4bq9p" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.292860 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5gk9h"/"openshift-service-ca.crt" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.293563 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5gk9h"/"kube-root-ca.crt" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.317392 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5gk9h/must-gather-4w8kz"] Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.444251 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-must-gather-output\") pod \"must-gather-4w8kz\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") " pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.444726 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmmt\" (UniqueName: 
\"kubernetes.io/projected/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-kube-api-access-xbmmt\") pod \"must-gather-4w8kz\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") " pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.546905 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-must-gather-output\") pod \"must-gather-4w8kz\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") " pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.547140 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmmt\" (UniqueName: \"kubernetes.io/projected/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-kube-api-access-xbmmt\") pod \"must-gather-4w8kz\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") " pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.547504 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-must-gather-output\") pod \"must-gather-4w8kz\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") " pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.572475 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmmt\" (UniqueName: \"kubernetes.io/projected/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-kube-api-access-xbmmt\") pod \"must-gather-4w8kz\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") " pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:56 crc kubenswrapper[4914]: I0130 22:08:56.626897 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" Jan 30 22:08:57 crc kubenswrapper[4914]: I0130 22:08:57.352204 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5gk9h/must-gather-4w8kz"] Jan 30 22:08:57 crc kubenswrapper[4914]: I0130 22:08:57.682311 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" event={"ID":"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0","Type":"ContainerStarted","Data":"bc13fca70aa587bbc9d085ea9f0996e7d4c8c7ede95edfc078f11286b6a61a22"} Jan 30 22:08:58 crc kubenswrapper[4914]: I0130 22:08:58.819117 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:08:58 crc kubenswrapper[4914]: E0130 22:08:58.819436 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:09:03 crc kubenswrapper[4914]: I0130 22:09:03.753155 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" event={"ID":"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0","Type":"ContainerStarted","Data":"ec2d60d06fb69d2da722174e44a6fab6b505a322c4c38903414d014e32743ff3"} Jan 30 22:09:04 crc kubenswrapper[4914]: I0130 22:09:04.772010 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" event={"ID":"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0","Type":"ContainerStarted","Data":"94313841409ce012de550bd827b78b36f34a05de1b22865084411f73e9a5e51e"} Jan 30 22:09:04 crc kubenswrapper[4914]: I0130 22:09:04.788016 4914 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" podStartSLOduration=3.000543753 podStartE2EDuration="8.788001493s" podCreationTimestamp="2026-01-30 22:08:56 +0000 UTC" firstStartedPulling="2026-01-30 22:08:57.38508016 +0000 UTC m=+3270.823716921" lastFinishedPulling="2026-01-30 22:09:03.1725379 +0000 UTC m=+3276.611174661" observedRunningTime="2026-01-30 22:09:04.78580832 +0000 UTC m=+3278.224445081" watchObservedRunningTime="2026-01-30 22:09:04.788001493 +0000 UTC m=+3278.226638254" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.726427 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-78lc7"] Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.728166 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.793430 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7b5q\" (UniqueName: \"kubernetes.io/projected/026294ce-bd98-4410-8f98-c332b8ff72fd-kube-api-access-j7b5q\") pod \"crc-debug-78lc7\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") " pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.793502 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/026294ce-bd98-4410-8f98-c332b8ff72fd-host\") pod \"crc-debug-78lc7\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") " pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.895821 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7b5q\" (UniqueName: \"kubernetes.io/projected/026294ce-bd98-4410-8f98-c332b8ff72fd-kube-api-access-j7b5q\") pod \"crc-debug-78lc7\" (UID: 
\"026294ce-bd98-4410-8f98-c332b8ff72fd\") " pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.896115 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/026294ce-bd98-4410-8f98-c332b8ff72fd-host\") pod \"crc-debug-78lc7\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") " pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.896223 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/026294ce-bd98-4410-8f98-c332b8ff72fd-host\") pod \"crc-debug-78lc7\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") " pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:07 crc kubenswrapper[4914]: I0130 22:09:07.927187 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7b5q\" (UniqueName: \"kubernetes.io/projected/026294ce-bd98-4410-8f98-c332b8ff72fd-kube-api-access-j7b5q\") pod \"crc-debug-78lc7\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") " pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:08 crc kubenswrapper[4914]: I0130 22:09:08.053161 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-78lc7" Jan 30 22:09:08 crc kubenswrapper[4914]: I0130 22:09:08.821231 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-78lc7" event={"ID":"026294ce-bd98-4410-8f98-c332b8ff72fd","Type":"ContainerStarted","Data":"bb96fc61842673a16066fb6eee505e2f3912cbc268580e5c7c0bb82bd3d664ca"} Jan 30 22:09:12 crc kubenswrapper[4914]: I0130 22:09:12.820158 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:09:12 crc kubenswrapper[4914]: E0130 22:09:12.821801 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:09:24 crc kubenswrapper[4914]: I0130 22:09:24.817926 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:09:24 crc kubenswrapper[4914]: E0130 22:09:24.818829 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:09:25 crc kubenswrapper[4914]: I0130 22:09:25.017034 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-78lc7" 
event={"ID":"026294ce-bd98-4410-8f98-c332b8ff72fd","Type":"ContainerStarted","Data":"2ec1467c72479f629d162d7d1829fb762555b9a3db9b0116a9d95f436e880a34"} Jan 30 22:09:25 crc kubenswrapper[4914]: I0130 22:09:25.038413 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5gk9h/crc-debug-78lc7" podStartSLOduration=2.409115361 podStartE2EDuration="18.038390949s" podCreationTimestamp="2026-01-30 22:09:07 +0000 UTC" firstStartedPulling="2026-01-30 22:09:08.10421827 +0000 UTC m=+3281.542855031" lastFinishedPulling="2026-01-30 22:09:23.733493858 +0000 UTC m=+3297.172130619" observedRunningTime="2026-01-30 22:09:25.030460434 +0000 UTC m=+3298.469097195" watchObservedRunningTime="2026-01-30 22:09:25.038390949 +0000 UTC m=+3298.477027710" Jan 30 22:09:38 crc kubenswrapper[4914]: I0130 22:09:38.818774 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:09:38 crc kubenswrapper[4914]: E0130 22:09:38.819582 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:09:53 crc kubenswrapper[4914]: I0130 22:09:53.818499 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:09:53 crc kubenswrapper[4914]: E0130 22:09:53.819512 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:10:05 crc kubenswrapper[4914]: I0130 22:10:05.818644 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:10:05 crc kubenswrapper[4914]: E0130 22:10:05.819497 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:10:17 crc kubenswrapper[4914]: I0130 22:10:17.824678 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:10:17 crc kubenswrapper[4914]: E0130 22:10:17.825503 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.245972 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c8gs5"] Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.255194 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.275371 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8gs5"]
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.428741 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-catalog-content\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.428875 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cb4\" (UniqueName: \"kubernetes.io/projected/398008a3-a324-4e81-9509-9bfe168aaf48-kube-api-access-74cb4\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.428908 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-utilities\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.531652 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-catalog-content\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.531783 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cb4\" (UniqueName: \"kubernetes.io/projected/398008a3-a324-4e81-9509-9bfe168aaf48-kube-api-access-74cb4\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.531812 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-utilities\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.532431 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-catalog-content\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.532451 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-utilities\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.557226 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cb4\" (UniqueName: \"kubernetes.io/projected/398008a3-a324-4e81-9509-9bfe168aaf48-kube-api-access-74cb4\") pod \"community-operators-c8gs5\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") " pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:18 crc kubenswrapper[4914]: I0130 22:10:18.581622 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:19 crc kubenswrapper[4914]: I0130 22:10:19.607620 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8gs5"]
Jan 30 22:10:20 crc kubenswrapper[4914]: I0130 22:10:20.526867 4914 generic.go:334] "Generic (PLEG): container finished" podID="398008a3-a324-4e81-9509-9bfe168aaf48" containerID="d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1" exitCode=0
Jan 30 22:10:20 crc kubenswrapper[4914]: I0130 22:10:20.527170 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerDied","Data":"d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1"}
Jan 30 22:10:20 crc kubenswrapper[4914]: I0130 22:10:20.527310 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerStarted","Data":"b39008ec914d5ac71b0f844adeefe345db8aeaff4af8e54700cc5e3295ec0a59"}
Jan 30 22:10:21 crc kubenswrapper[4914]: I0130 22:10:21.538521 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerStarted","Data":"16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6"}
Jan 30 22:10:24 crc kubenswrapper[4914]: I0130 22:10:24.579372 4914 generic.go:334] "Generic (PLEG): container finished" podID="398008a3-a324-4e81-9509-9bfe168aaf48" containerID="16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6" exitCode=0
Jan 30 22:10:24 crc kubenswrapper[4914]: I0130 22:10:24.579427 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerDied","Data":"16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6"}
Jan 30 22:10:25 crc kubenswrapper[4914]: I0130 22:10:25.589934 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerStarted","Data":"a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0"}
Jan 30 22:10:25 crc kubenswrapper[4914]: I0130 22:10:25.625019 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c8gs5" podStartSLOduration=3.152185126 podStartE2EDuration="7.624996011s" podCreationTimestamp="2026-01-30 22:10:18 +0000 UTC" firstStartedPulling="2026-01-30 22:10:20.533388222 +0000 UTC m=+3353.972024983" lastFinishedPulling="2026-01-30 22:10:25.006199107 +0000 UTC m=+3358.444835868" observedRunningTime="2026-01-30 22:10:25.613085638 +0000 UTC m=+3359.051722419" watchObservedRunningTime="2026-01-30 22:10:25.624996011 +0000 UTC m=+3359.063632772"
Jan 30 22:10:28 crc kubenswrapper[4914]: I0130 22:10:28.582439 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:28 crc kubenswrapper[4914]: I0130 22:10:28.585291 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:29 crc kubenswrapper[4914]: I0130 22:10:29.649279 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-c8gs5" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="registry-server" probeResult="failure" output=<
Jan 30 22:10:29 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s
Jan 30 22:10:29 crc kubenswrapper[4914]: >
Jan 30 22:10:30 crc kubenswrapper[4914]: I0130 22:10:30.648190 4914 generic.go:334] "Generic (PLEG): container finished" podID="026294ce-bd98-4410-8f98-c332b8ff72fd" containerID="2ec1467c72479f629d162d7d1829fb762555b9a3db9b0116a9d95f436e880a34" exitCode=0
Jan 30 22:10:30 crc kubenswrapper[4914]: I0130 22:10:30.648295 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-78lc7" event={"ID":"026294ce-bd98-4410-8f98-c332b8ff72fd","Type":"ContainerDied","Data":"2ec1467c72479f629d162d7d1829fb762555b9a3db9b0116a9d95f436e880a34"}
Jan 30 22:10:30 crc kubenswrapper[4914]: I0130 22:10:30.818998 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:10:30 crc kubenswrapper[4914]: E0130 22:10:30.819310 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.792896 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-78lc7"
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.839960 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-78lc7"]
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.852285 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-78lc7"]
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.930542 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7b5q\" (UniqueName: \"kubernetes.io/projected/026294ce-bd98-4410-8f98-c332b8ff72fd-kube-api-access-j7b5q\") pod \"026294ce-bd98-4410-8f98-c332b8ff72fd\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") "
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.930832 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/026294ce-bd98-4410-8f98-c332b8ff72fd-host\") pod \"026294ce-bd98-4410-8f98-c332b8ff72fd\" (UID: \"026294ce-bd98-4410-8f98-c332b8ff72fd\") "
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.930923 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/026294ce-bd98-4410-8f98-c332b8ff72fd-host" (OuterVolumeSpecName: "host") pod "026294ce-bd98-4410-8f98-c332b8ff72fd" (UID: "026294ce-bd98-4410-8f98-c332b8ff72fd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.933125 4914 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/026294ce-bd98-4410-8f98-c332b8ff72fd-host\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:31 crc kubenswrapper[4914]: I0130 22:10:31.942738 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026294ce-bd98-4410-8f98-c332b8ff72fd-kube-api-access-j7b5q" (OuterVolumeSpecName: "kube-api-access-j7b5q") pod "026294ce-bd98-4410-8f98-c332b8ff72fd" (UID: "026294ce-bd98-4410-8f98-c332b8ff72fd"). InnerVolumeSpecName "kube-api-access-j7b5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:10:32 crc kubenswrapper[4914]: I0130 22:10:32.035548 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7b5q\" (UniqueName: \"kubernetes.io/projected/026294ce-bd98-4410-8f98-c332b8ff72fd-kube-api-access-j7b5q\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:32 crc kubenswrapper[4914]: I0130 22:10:32.669374 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb96fc61842673a16066fb6eee505e2f3912cbc268580e5c7c0bb82bd3d664ca"
Jan 30 22:10:32 crc kubenswrapper[4914]: I0130 22:10:32.669419 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-78lc7"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.097742 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-82gm2"]
Jan 30 22:10:33 crc kubenswrapper[4914]: E0130 22:10:33.098782 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026294ce-bd98-4410-8f98-c332b8ff72fd" containerName="container-00"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.098795 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="026294ce-bd98-4410-8f98-c332b8ff72fd" containerName="container-00"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.099192 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="026294ce-bd98-4410-8f98-c332b8ff72fd" containerName="container-00"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.105466 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.256838 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwwj2\" (UniqueName: \"kubernetes.io/projected/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-kube-api-access-vwwj2\") pod \"crc-debug-82gm2\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") " pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.257022 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-host\") pod \"crc-debug-82gm2\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") " pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.358385 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-host\") pod \"crc-debug-82gm2\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") " pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.358516 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwwj2\" (UniqueName: \"kubernetes.io/projected/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-kube-api-access-vwwj2\") pod \"crc-debug-82gm2\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") " pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.358512 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-host\") pod \"crc-debug-82gm2\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") " pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.378735 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwwj2\" (UniqueName: \"kubernetes.io/projected/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-kube-api-access-vwwj2\") pod \"crc-debug-82gm2\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") " pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.429760 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.679381 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-82gm2" event={"ID":"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01","Type":"ContainerStarted","Data":"b951bef1c71dbf9de6f6a7d441d4afcd931418715a0c150dd4fb7a9d996a9a56"}
Jan 30 22:10:33 crc kubenswrapper[4914]: I0130 22:10:33.830565 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026294ce-bd98-4410-8f98-c332b8ff72fd" path="/var/lib/kubelet/pods/026294ce-bd98-4410-8f98-c332b8ff72fd/volumes"
Jan 30 22:10:34 crc kubenswrapper[4914]: I0130 22:10:34.691581 4914 generic.go:334] "Generic (PLEG): container finished" podID="b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" containerID="2642efdcafc7bbfd193144bdf702365e42ba636ae59305303e4a071216110202" exitCode=0
Jan 30 22:10:34 crc kubenswrapper[4914]: I0130 22:10:34.691699 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-82gm2" event={"ID":"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01","Type":"ContainerDied","Data":"2642efdcafc7bbfd193144bdf702365e42ba636ae59305303e4a071216110202"}
Jan 30 22:10:35 crc kubenswrapper[4914]: I0130 22:10:35.823479 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:35 crc kubenswrapper[4914]: I0130 22:10:35.906839 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-host\") pod \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") "
Jan 30 22:10:35 crc kubenswrapper[4914]: I0130 22:10:35.906968 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwwj2\" (UniqueName: \"kubernetes.io/projected/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-kube-api-access-vwwj2\") pod \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\" (UID: \"b7573603-ebf2-4c3a-8bd3-aa7bb3771a01\") "
Jan 30 22:10:35 crc kubenswrapper[4914]: I0130 22:10:35.907051 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-host" (OuterVolumeSpecName: "host") pod "b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" (UID: "b7573603-ebf2-4c3a-8bd3-aa7bb3771a01"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 22:10:35 crc kubenswrapper[4914]: I0130 22:10:35.907600 4914 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-host\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:35 crc kubenswrapper[4914]: I0130 22:10:35.914068 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-kube-api-access-vwwj2" (OuterVolumeSpecName: "kube-api-access-vwwj2") pod "b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" (UID: "b7573603-ebf2-4c3a-8bd3-aa7bb3771a01"). InnerVolumeSpecName "kube-api-access-vwwj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:10:36 crc kubenswrapper[4914]: I0130 22:10:36.009064 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwwj2\" (UniqueName: \"kubernetes.io/projected/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01-kube-api-access-vwwj2\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:36 crc kubenswrapper[4914]: I0130 22:10:36.074565 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-82gm2"]
Jan 30 22:10:36 crc kubenswrapper[4914]: I0130 22:10:36.083630 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-82gm2"]
Jan 30 22:10:36 crc kubenswrapper[4914]: I0130 22:10:36.718089 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b951bef1c71dbf9de6f6a7d441d4afcd931418715a0c150dd4fb7a9d996a9a56"
Jan 30 22:10:36 crc kubenswrapper[4914]: I0130 22:10:36.718325 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-82gm2"
Jan 30 22:10:36 crc kubenswrapper[4914]: E0130 22:10:36.920560 4914 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7573603_ebf2_4c3a_8bd3_aa7bb3771a01.slice/crio-b951bef1c71dbf9de6f6a7d441d4afcd931418715a0c150dd4fb7a9d996a9a56\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7573603_ebf2_4c3a_8bd3_aa7bb3771a01.slice\": RecentStats: unable to find data in memory cache]"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.259063 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-jjv9g"]
Jan 30 22:10:37 crc kubenswrapper[4914]: E0130 22:10:37.260110 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" containerName="container-00"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.260136 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" containerName="container-00"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.260444 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" containerName="container-00"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.261504 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.337305 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkl9g\" (UniqueName: \"kubernetes.io/projected/5a7acfda-c7d9-4c01-a780-acbb6f88c418-kube-api-access-gkl9g\") pod \"crc-debug-jjv9g\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") " pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.337638 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a7acfda-c7d9-4c01-a780-acbb6f88c418-host\") pod \"crc-debug-jjv9g\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") " pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.442752 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a7acfda-c7d9-4c01-a780-acbb6f88c418-host\") pod \"crc-debug-jjv9g\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") " pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.442868 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl9g\" (UniqueName: \"kubernetes.io/projected/5a7acfda-c7d9-4c01-a780-acbb6f88c418-kube-api-access-gkl9g\") pod \"crc-debug-jjv9g\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") " pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.442875 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a7acfda-c7d9-4c01-a780-acbb6f88c418-host\") pod \"crc-debug-jjv9g\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") " pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.472728 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl9g\" (UniqueName: \"kubernetes.io/projected/5a7acfda-c7d9-4c01-a780-acbb6f88c418-kube-api-access-gkl9g\") pod \"crc-debug-jjv9g\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") " pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.578423 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:37 crc kubenswrapper[4914]: W0130 22:10:37.601885 4914 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a7acfda_c7d9_4c01_a780_acbb6f88c418.slice/crio-687f3bf32cd184fcf4b9cb527fe92ff93dca9fa54e02108486a268cf02dc7e00 WatchSource:0}: Error finding container 687f3bf32cd184fcf4b9cb527fe92ff93dca9fa54e02108486a268cf02dc7e00: Status 404 returned error can't find the container with id 687f3bf32cd184fcf4b9cb527fe92ff93dca9fa54e02108486a268cf02dc7e00
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.729534 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-jjv9g" event={"ID":"5a7acfda-c7d9-4c01-a780-acbb6f88c418","Type":"ContainerStarted","Data":"687f3bf32cd184fcf4b9cb527fe92ff93dca9fa54e02108486a268cf02dc7e00"}
Jan 30 22:10:37 crc kubenswrapper[4914]: I0130 22:10:37.838408 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7573603-ebf2-4c3a-8bd3-aa7bb3771a01" path="/var/lib/kubelet/pods/b7573603-ebf2-4c3a-8bd3-aa7bb3771a01/volumes"
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.636619 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.697639 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.740642 4914 generic.go:334] "Generic (PLEG): container finished" podID="5a7acfda-c7d9-4c01-a780-acbb6f88c418" containerID="4de2e90289340608b1276844cddf320e5d54ed022d3030c92f7e0f05bf304d83" exitCode=0
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.740689 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/crc-debug-jjv9g" event={"ID":"5a7acfda-c7d9-4c01-a780-acbb6f88c418","Type":"ContainerDied","Data":"4de2e90289340608b1276844cddf320e5d54ed022d3030c92f7e0f05bf304d83"}
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.781797 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-jjv9g"]
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.792618 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5gk9h/crc-debug-jjv9g"]
Jan 30 22:10:38 crc kubenswrapper[4914]: I0130 22:10:38.878557 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8gs5"]
Jan 30 22:10:39 crc kubenswrapper[4914]: I0130 22:10:39.748846 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c8gs5" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="registry-server" containerID="cri-o://a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0" gracePeriod=2
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.010318 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.094763 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkl9g\" (UniqueName: \"kubernetes.io/projected/5a7acfda-c7d9-4c01-a780-acbb6f88c418-kube-api-access-gkl9g\") pod \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") "
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.094825 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a7acfda-c7d9-4c01-a780-acbb6f88c418-host\") pod \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\" (UID: \"5a7acfda-c7d9-4c01-a780-acbb6f88c418\") "
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.095582 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a7acfda-c7d9-4c01-a780-acbb6f88c418-host" (OuterVolumeSpecName: "host") pod "5a7acfda-c7d9-4c01-a780-acbb6f88c418" (UID: "5a7acfda-c7d9-4c01-a780-acbb6f88c418"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.105602 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7acfda-c7d9-4c01-a780-acbb6f88c418-kube-api-access-gkl9g" (OuterVolumeSpecName: "kube-api-access-gkl9g") pod "5a7acfda-c7d9-4c01-a780-acbb6f88c418" (UID: "5a7acfda-c7d9-4c01-a780-acbb6f88c418"). InnerVolumeSpecName "kube-api-access-gkl9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.198457 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkl9g\" (UniqueName: \"kubernetes.io/projected/5a7acfda-c7d9-4c01-a780-acbb6f88c418-kube-api-access-gkl9g\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.198501 4914 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a7acfda-c7d9-4c01-a780-acbb6f88c418-host\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.561886 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.708256 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-catalog-content\") pod \"398008a3-a324-4e81-9509-9bfe168aaf48\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") "
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.708448 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74cb4\" (UniqueName: \"kubernetes.io/projected/398008a3-a324-4e81-9509-9bfe168aaf48-kube-api-access-74cb4\") pod \"398008a3-a324-4e81-9509-9bfe168aaf48\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") "
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.708488 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-utilities\") pod \"398008a3-a324-4e81-9509-9bfe168aaf48\" (UID: \"398008a3-a324-4e81-9509-9bfe168aaf48\") "
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.709343 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-utilities" (OuterVolumeSpecName: "utilities") pod "398008a3-a324-4e81-9509-9bfe168aaf48" (UID: "398008a3-a324-4e81-9509-9bfe168aaf48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.709797 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.714954 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398008a3-a324-4e81-9509-9bfe168aaf48-kube-api-access-74cb4" (OuterVolumeSpecName: "kube-api-access-74cb4") pod "398008a3-a324-4e81-9509-9bfe168aaf48" (UID: "398008a3-a324-4e81-9509-9bfe168aaf48"). InnerVolumeSpecName "kube-api-access-74cb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.765020 4914 generic.go:334] "Generic (PLEG): container finished" podID="398008a3-a324-4e81-9509-9bfe168aaf48" containerID="a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0" exitCode=0
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.765106 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerDied","Data":"a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0"}
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.765140 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8gs5" event={"ID":"398008a3-a324-4e81-9509-9bfe168aaf48","Type":"ContainerDied","Data":"b39008ec914d5ac71b0f844adeefe345db8aeaff4af8e54700cc5e3295ec0a59"}
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.765163 4914 scope.go:117] "RemoveContainer" containerID="a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.766337 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8gs5"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.766813 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/crc-debug-jjv9g"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.772279 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398008a3-a324-4e81-9509-9bfe168aaf48" (UID: "398008a3-a324-4e81-9509-9bfe168aaf48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.811731 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398008a3-a324-4e81-9509-9bfe168aaf48-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.811772 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74cb4\" (UniqueName: \"kubernetes.io/projected/398008a3-a324-4e81-9509-9bfe168aaf48-kube-api-access-74cb4\") on node \"crc\" DevicePath \"\""
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.818322 4914 scope.go:117] "RemoveContainer" containerID="16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.840549 4914 scope.go:117] "RemoveContainer" containerID="d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.886656 4914 scope.go:117] "RemoveContainer" containerID="a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0"
Jan 30 22:10:40 crc kubenswrapper[4914]: E0130 22:10:40.887194 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0\": container with ID starting with a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0 not found: ID does not exist" containerID="a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.887240 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0"} err="failed to get container status \"a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0\": rpc error: code = NotFound desc = could not find container \"a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0\": container with ID starting with a784a203cfcabeddf59d28233f9ef5d98b78844e635a61a42a34973803bb00d0 not found: ID does not exist"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.887266 4914 scope.go:117] "RemoveContainer" containerID="16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6"
Jan 30 22:10:40 crc kubenswrapper[4914]: E0130 22:10:40.887687 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6\": container with ID starting with 16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6 not found: ID does not exist" containerID="16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.887729 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6"} err="failed to get container status \"16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6\": rpc error: code = NotFound desc = could not find container \"16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6\": container with ID starting with 16cf38ed48ea39504f625bbebbbd97a166f7aeb3737197309fbbd778d1ac98f6 not found: ID does not exist"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.887748 4914 scope.go:117] "RemoveContainer" containerID="d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1"
Jan 30 22:10:40 crc kubenswrapper[4914]: E0130 22:10:40.888005 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1\": container with ID starting with d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1 not found: ID does not exist" containerID="d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.888050 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1"} err="failed to get container status \"d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1\": rpc error: code = NotFound desc = could not find container \"d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1\": container with ID starting with d2bb11acd9c302a691f45d159e2eb7aff4aea27a875d61104518f24d24e394d1 not found: ID does not exist"
Jan 30 22:10:40 crc kubenswrapper[4914]: I0130 22:10:40.888079 4914 scope.go:117] "RemoveContainer" containerID="4de2e90289340608b1276844cddf320e5d54ed022d3030c92f7e0f05bf304d83"
Jan 30 22:10:41 crc kubenswrapper[4914]: I0130 22:10:41.102041 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8gs5"]
Jan 30 22:10:41 crc kubenswrapper[4914]: I0130 22:10:41.111314 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c8gs5"]
Jan 30 22:10:41 crc kubenswrapper[4914]: I0130 22:10:41.818410 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:10:41 crc kubenswrapper[4914]: E0130 22:10:41.818796 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:10:41 crc kubenswrapper[4914]: I0130 22:10:41.830506 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" path="/var/lib/kubelet/pods/398008a3-a324-4e81-9509-9bfe168aaf48/volumes"
Jan 30 22:10:41 crc kubenswrapper[4914]: I0130 22:10:41.831515 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7acfda-c7d9-4c01-a780-acbb6f88c418" path="/var/lib/kubelet/pods/5a7acfda-c7d9-4c01-a780-acbb6f88c418/volumes"
Jan 30 22:10:56 crc kubenswrapper[4914]: I0130 22:10:56.818813 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:10:56 crc kubenswrapper[4914]: E0130 22:10:56.819624 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\""
pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:11:07 crc kubenswrapper[4914]: I0130 22:11:07.418616 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_46107121-a72c-40a7-904c-24c6c33de7c4/init-config-reloader/0.log" Jan 30 22:11:07 crc kubenswrapper[4914]: I0130 22:11:07.620850 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_46107121-a72c-40a7-904c-24c6c33de7c4/config-reloader/0.log" Jan 30 22:11:07 crc kubenswrapper[4914]: I0130 22:11:07.677188 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_46107121-a72c-40a7-904c-24c6c33de7c4/init-config-reloader/0.log" Jan 30 22:11:07 crc kubenswrapper[4914]: I0130 22:11:07.754482 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_46107121-a72c-40a7-904c-24c6c33de7c4/alertmanager/0.log" Jan 30 22:11:07 crc kubenswrapper[4914]: I0130 22:11:07.927986 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-975f98546-d2x5z_b076a462-e37e-496a-9587-78a4f9b07232/barbican-api/0.log" Jan 30 22:11:07 crc kubenswrapper[4914]: I0130 22:11:07.999128 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-975f98546-d2x5z_b076a462-e37e-496a-9587-78a4f9b07232/barbican-api-log/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.037059 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d84ffccf8-2q5ts_37896a03-ab80-432f-b7b0-490652061464/barbican-keystone-listener/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.247528 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d84ffccf8-2q5ts_37896a03-ab80-432f-b7b0-490652061464/barbican-keystone-listener-log/0.log" Jan 30 
22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.293872 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d6d6d58ff-ngrc6_3c844063-c103-4c6e-92ae-d9f1e0e897eb/barbican-worker/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.316806 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d6d6d58ff-ngrc6_3c844063-c103-4c6e-92ae-d9f1e0e897eb/barbican-worker-log/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.510913 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-g6dkm_eed9f005-df08-43c9-b8e8-cd334d777714/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.637654 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_49463c19-f32a-4288-9a5a-51d9c7b11e42/ceilometer-central-agent/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.723431 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_49463c19-f32a-4288-9a5a-51d9c7b11e42/proxy-httpd/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.776004 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_49463c19-f32a-4288-9a5a-51d9c7b11e42/ceilometer-notification-agent/0.log" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.818003 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:11:08 crc kubenswrapper[4914]: E0130 22:11:08.818288 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:11:08 crc kubenswrapper[4914]: I0130 22:11:08.839647 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_49463c19-f32a-4288-9a5a-51d9c7b11e42/sg-core/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.075137 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c284840e-6355-4145-9853-723a3d280963/cinder-api-log/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.088781 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c284840e-6355-4145-9853-723a3d280963/cinder-api/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.242674 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d44aada2-0a99-4783-89f2-55ccde6477d7/cinder-scheduler/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.290439 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d44aada2-0a99-4783-89f2-55ccde6477d7/probe/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.434655 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_3fb76217-e54d-437c-91a9-170a095719ee/cloudkitty-api/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.494510 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_3fb76217-e54d-437c-91a9-170a095719ee/cloudkitty-api-log/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.651501 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_1cd64ca8-c110-4af1-ad2e-edbed561a3b3/loki-compactor/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.750807 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-66dfd9bb-5wh44_915fbbd9-20c9-4552-bf18-a61af008b1d8/loki-distributor/0.log" Jan 30 22:11:09 crc kubenswrapper[4914]: I0130 22:11:09.856560 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-8nmcj_2954a978-cc4d-4e5a-95af-d3bab9a9b3d1/gateway/0.log" Jan 30 22:11:10 crc kubenswrapper[4914]: I0130 22:11:10.023818 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-9z4mv_844bac7f-9f50-49c2-a05c-963b99ca4490/gateway/0.log" Jan 30 22:11:10 crc kubenswrapper[4914]: I0130 22:11:10.303508 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_8b4ebe0e-413b-4b5e-9239-a946ce2ca0f5/loki-index-gateway/0.log" Jan 30 22:11:10 crc kubenswrapper[4914]: I0130 22:11:10.349876 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_79bbc9b2-1d6f-4d07-bf58-ba44f0e717b0/loki-ingester/0.log" Jan 30 22:11:10 crc kubenswrapper[4914]: I0130 22:11:10.573369 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-5cd44666df-g6p7k_c2060bc5-fb2c-4421-b6a0-7acbd5549c8d/loki-query-frontend/0.log" Jan 30 22:11:10 crc kubenswrapper[4914]: I0130 22:11:10.688658 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-795fd8f8cc-vq9hr_e528e0c0-c547-4d1d-8624-f8b2c8d450cf/loki-querier/0.log" Jan 30 22:11:11 crc kubenswrapper[4914]: I0130 22:11:11.154010 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-58t6k_e9790abb-7691-489b-a30b-84738f413edc/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:11 crc kubenswrapper[4914]: I0130 22:11:11.529268 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ztwk4_9717f88f-15d4-4b1d-93dd-dc656e5c64f6/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:11 crc kubenswrapper[4914]: I0130 22:11:11.600348 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-h8dtr_6de60584-391f-413b-b341-8abcd770eb7d/init/0.log" Jan 30 22:11:11 crc kubenswrapper[4914]: I0130 22:11:11.913154 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-h8dtr_6de60584-391f-413b-b341-8abcd770eb7d/dnsmasq-dns/0.log" Jan 30 22:11:11 crc kubenswrapper[4914]: I0130 22:11:11.960071 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-h8dtr_6de60584-391f-413b-b341-8abcd770eb7d/init/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.005376 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bczr4_a8bcc4f1-23fa-40da-8a45-7c89b377e6d7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.262402 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc35b1ae-deb3-425f-86a2-530461b4a6f1/glance-log/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.297516 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc35b1ae-deb3-425f-86a2-530461b4a6f1/glance-httpd/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.656726 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_00a00054-eb1e-492b-854d-4ea3396983ef/glance-httpd/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.664141 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kl2tj_d218c024-e43c-4d70-8e68-4d03d423d9ae/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.693998 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_00a00054-eb1e-492b-854d-4ea3396983ef/glance-log/0.log" Jan 30 22:11:12 crc kubenswrapper[4914]: I0130 22:11:12.975810 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ks7d4_51e0c921-ba72-45cf-b9b3-1b28148761d4/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:13 crc kubenswrapper[4914]: I0130 22:11:13.044315 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_022f86fb-3379-48ae-8987-348212c3e28e/cloudkitty-proc/0.log" Jan 30 22:11:13 crc kubenswrapper[4914]: I0130 22:11:13.245856 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29496841-5k68x_1804d838-ccaf-46f1-a848-81790716a2f4/keystone-cron/0.log" Jan 30 22:11:13 crc kubenswrapper[4914]: I0130 22:11:13.259292 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6c7c6d9b88-rpslq_c8f39fc5-1811-4872-b99a-4ba212837d75/keystone-api/0.log" Jan 30 22:11:13 crc kubenswrapper[4914]: I0130 22:11:13.336727 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9d788ea3-1370-4a64-aff1-d8e2af7c6f94/kube-state-metrics/0.log" Jan 30 22:11:13 crc kubenswrapper[4914]: I0130 22:11:13.564790 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mlw2d_90248a7e-c99e-4777-8767-3694c7a5b588/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:13 crc kubenswrapper[4914]: I0130 22:11:13.994928 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ns7cz_7f19f9c6-b274-40bd-9693-b26eb56bbe0a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:14 crc kubenswrapper[4914]: I0130 22:11:14.001085 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-594584649-k6kdl_60c348bd-a7cd-4220-aad6-39e33a8b3649/neutron-httpd/0.log" Jan 30 22:11:14 crc kubenswrapper[4914]: I0130 22:11:14.077469 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-594584649-k6kdl_60c348bd-a7cd-4220-aad6-39e33a8b3649/neutron-api/0.log" Jan 30 22:11:14 crc kubenswrapper[4914]: I0130 22:11:14.590323 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_25d6bd81-138b-48c0-8a75-586bb6489321/nova-api-log/0.log" Jan 30 22:11:14 crc kubenswrapper[4914]: I0130 22:11:14.761698 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8c466dcf-34cd-44fc-9516-16b7ec5cb492/nova-cell0-conductor-conductor/0.log" Jan 30 22:11:14 crc kubenswrapper[4914]: I0130 22:11:14.858147 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_25d6bd81-138b-48c0-8a75-586bb6489321/nova-api-api/0.log" Jan 30 22:11:14 crc kubenswrapper[4914]: I0130 22:11:14.948184 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_95ce4f8e-21fa-41f4-a300-a1cff7594ce3/nova-cell1-conductor-conductor/0.log" Jan 30 22:11:15 crc kubenswrapper[4914]: I0130 22:11:15.054849 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_60990201-067f-48d5-8a61-1a117609cfc7/nova-cell1-novncproxy-novncproxy/0.log" Jan 30 22:11:15 crc kubenswrapper[4914]: I0130 22:11:15.313445 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zmw7z_9490e581-cf4d-4139-a77f-5f2b790ea96b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:15 crc kubenswrapper[4914]: I0130 22:11:15.452318 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0a4d1d85-ceb5-43da-85f2-a3b8d39590ab/nova-metadata-log/0.log" Jan 30 22:11:15 crc kubenswrapper[4914]: I0130 22:11:15.791531 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1cf3c517-6ee1-4af1-a62b-bf572596a05a/nova-scheduler-scheduler/0.log" Jan 30 22:11:15 crc kubenswrapper[4914]: I0130 22:11:15.882308 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_da3bc7da-e810-4d0a-a7df-792c544f3a23/mysql-bootstrap/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.196012 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_da3bc7da-e810-4d0a-a7df-792c544f3a23/galera/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.248830 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_da3bc7da-e810-4d0a-a7df-792c544f3a23/mysql-bootstrap/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.433133 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_63625a35-5028-4dda-b9b3-ec3910fd8385/mysql-bootstrap/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.551140 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0a4d1d85-ceb5-43da-85f2-a3b8d39590ab/nova-metadata-metadata/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.700403 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_63625a35-5028-4dda-b9b3-ec3910fd8385/mysql-bootstrap/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.740737 4914 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_63625a35-5028-4dda-b9b3-ec3910fd8385/galera/0.log" Jan 30 22:11:16 crc kubenswrapper[4914]: I0130 22:11:16.788069 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ce7a9cdf-dbb2-4055-a31e-b0fb2771bfed/openstackclient/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.078750 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-68th2_c819be77-2b86-4bbf-9e4b-f9738f59032d/openstack-network-exporter/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.229195 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kv2g9_11cefee1-f5e9-4f79-b25b-8dae49655475/ovsdb-server-init/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.392661 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kv2g9_11cefee1-f5e9-4f79-b25b-8dae49655475/ovsdb-server/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.405267 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kv2g9_11cefee1-f5e9-4f79-b25b-8dae49655475/ovs-vswitchd/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.432531 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kv2g9_11cefee1-f5e9-4f79-b25b-8dae49655475/ovsdb-server-init/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.647732 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rdzm9_3f063a16-987d-4378-b889-966755034c3e/ovn-controller/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.860521 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7nn89_1d553e8c-3252-4e0b-87f6-8e649c83f3de/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 
22:11:17.943677 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_91e45099-57bd-49e7-aa99-5e11b711ec92/openstack-network-exporter/0.log" Jan 30 22:11:17 crc kubenswrapper[4914]: I0130 22:11:17.969275 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_91e45099-57bd-49e7-aa99-5e11b711ec92/ovn-northd/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.108863 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abe9f42c-7055-4099-ad8e-f827973007cd/openstack-network-exporter/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.257379 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abe9f42c-7055-4099-ad8e-f827973007cd/ovsdbserver-nb/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.396901 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_555d8330-2863-4fe8-96b8-2a751de6569d/openstack-network-exporter/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.458733 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_555d8330-2863-4fe8-96b8-2a751de6569d/ovsdbserver-sb/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.672630 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bf9bcb7dd-cl45x_d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a/placement-api/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.737550 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bf9bcb7dd-cl45x_d54b2e31-fb6a-4ab1-b680-8aeb4a663b3a/placement-log/0.log" Jan 30 22:11:18 crc kubenswrapper[4914]: I0130 22:11:18.872767 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6019a332-1bf4-40c9-9ed7-6956d8532e9c/init-config-reloader/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.026518 4914 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6019a332-1bf4-40c9-9ed7-6956d8532e9c/config-reloader/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.079434 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6019a332-1bf4-40c9-9ed7-6956d8532e9c/init-config-reloader/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.154347 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6019a332-1bf4-40c9-9ed7-6956d8532e9c/prometheus/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.170261 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6019a332-1bf4-40c9-9ed7-6956d8532e9c/thanos-sidecar/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.335341 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_149d6b1c-dd4d-4433-906d-6774aeb77afb/setup-container/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.633764 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_149d6b1c-dd4d-4433-906d-6774aeb77afb/rabbitmq/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.691072 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_149d6b1c-dd4d-4433-906d-6774aeb77afb/setup-container/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.799344 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc011821-8710-499b-8547-4ab18c9e2592/setup-container/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.954198 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc011821-8710-499b-8547-4ab18c9e2592/setup-container/0.log" Jan 30 22:11:19 crc kubenswrapper[4914]: I0130 22:11:19.960257 4914 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc011821-8710-499b-8547-4ab18c9e2592/rabbitmq/0.log" Jan 30 22:11:20 crc kubenswrapper[4914]: I0130 22:11:20.015667 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-4ncwq_36499faa-28d4-4710-9190-125a3f1561a8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:20 crc kubenswrapper[4914]: I0130 22:11:20.303146 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jvkxv_672f23d1-408d-4b3e-9068-66faf28b06bb/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:20 crc kubenswrapper[4914]: I0130 22:11:20.414820 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xr9lj_59c442fc-77b4-430b-8522-86705c6f7d3c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:20 crc kubenswrapper[4914]: I0130 22:11:20.708911 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-z2rh5_f8711449-6cb3-4b5e-9b13-618cb27f35dc/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:20 crc kubenswrapper[4914]: I0130 22:11:20.755149 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qrk98_289a7e75-7ac0-4b40-9080-82dd58f5c81d/ssh-known-hosts-edpm-deployment/0.log" Jan 30 22:11:20 crc kubenswrapper[4914]: I0130 22:11:20.817986 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:11:20 crc kubenswrapper[4914]: E0130 22:11:20.818446 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.005241 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-79c4f49899-bc7gl_0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2/proxy-server/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.143295 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-79c4f49899-bc7gl_0ba4c264-b7bb-4e62-aa3b-a5220bfbb7a2/proxy-httpd/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.192689 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-4n6nd_73c592d5-ad34-4357-b3a8-ecc0567e1e8d/swift-ring-rebalance/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.396604 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/account-auditor/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.410170 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/account-reaper/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.583113 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/account-replicator/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.675061 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/container-auditor/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.738312 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/account-server/0.log" Jan 30 22:11:21 crc 
kubenswrapper[4914]: I0130 22:11:21.841161 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/container-replicator/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.931592 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/container-server/0.log" Jan 30 22:11:21 crc kubenswrapper[4914]: I0130 22:11:21.944908 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/container-updater/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.006349 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/object-auditor/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.059293 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/object-expirer/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.190018 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/object-server/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.197919 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/object-replicator/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.208810 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/object-updater/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.294437 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/rsync/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.426834 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3a754950-b587-4c0a-85ed-e9669582ea2c/swift-recon-cron/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.627561 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-g9gkg_294df817-0302-46c7-84cf-f300a188d47a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.750340 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_3aa1ebbf-1b7b-416b-8fe7-45c318d1d0a0/tempest-tests-tempest-tests-runner/0.log" Jan 30 22:11:22 crc kubenswrapper[4914]: I0130 22:11:22.912269 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_02341028-de3c-48cc-af26-413ff79477a0/test-operator-logs-container/0.log" Jan 30 22:11:23 crc kubenswrapper[4914]: I0130 22:11:23.067680 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7n7t8_7d4c7903-33ee-499b-abb7-4e029ec9f925/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:11:27 crc kubenswrapper[4914]: I0130 22:11:27.649062 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1dbbcbee-a7d4-4638-9d20-dbeda6ccdde0/memcached/0.log" Jan 30 22:11:33 crc kubenswrapper[4914]: I0130 22:11:33.819198 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650" Jan 30 22:11:34 crc kubenswrapper[4914]: I0130 22:11:34.335234 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"25c099c982ea12314f9b5510223aeafb2aa3a30f2bbddfee53e26959ce1db558"} Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.001274 4914 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-9jf6b_8c596723-7c41-448d-831e-07fa9d1129e9/manager/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.383595 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/util/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.546181 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/pull/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.547646 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/util/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.577422 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/pull/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.793245 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/extract/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.803895 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/util/0.log" Jan 30 22:11:54 crc kubenswrapper[4914]: I0130 22:11:54.834611 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcknznd_9691502f-3160-40b6-9f3c-21a2545b14ac/pull/0.log" Jan 30 22:11:54 crc 
kubenswrapper[4914]: I0130 22:11:54.983863 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-54hzc_7fa005ac-0d9e-4784-8558-df96b2d54006/manager/0.log" Jan 30 22:11:55 crc kubenswrapper[4914]: I0130 22:11:55.010876 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-t225p_9a7f7899-f35e-4fef-ba51-82af970498db/manager/0.log" Jan 30 22:11:55 crc kubenswrapper[4914]: I0130 22:11:55.272020 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-hrq78_c6d2cebc-7c79-407e-8f69-6b93ab2b41b7/manager/0.log" Jan 30 22:11:55 crc kubenswrapper[4914]: I0130 22:11:55.380743 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-qz2j7_3330ea8a-466e-4ba5-ad5e-dcb7859521b0/manager/0.log" Jan 30 22:11:55 crc kubenswrapper[4914]: I0130 22:11:55.636007 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-vvdgj_1746db5a-3b9a-4d76-b1f3-845b907ccabc/manager/0.log" Jan 30 22:11:55 crc kubenswrapper[4914]: I0130 22:11:55.902004 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-bmgvt_91003d71-490f-458c-94e9-d8957d6eaac9/manager/0.log" Jan 30 22:11:55 crc kubenswrapper[4914]: I0130 22:11:55.933603 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-52vwl_f96847c8-b695-44a5-8756-b2fe0da5e409/manager/0.log" Jan 30 22:11:56 crc kubenswrapper[4914]: I0130 22:11:56.087778 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-qc9x5_8419dc35-995b-43a3-82b7-6c2b7eb66d35/manager/0.log" Jan 30 
22:11:56 crc kubenswrapper[4914]: I0130 22:11:56.121275 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-648g4_0f009ea3-601c-4c2c-bbf4-d300abfe1100/manager/0.log" Jan 30 22:11:56 crc kubenswrapper[4914]: I0130 22:11:56.463182 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-kjw6v_aa6ad36a-4244-490c-970b-52b03b3c3821/manager/0.log" Jan 30 22:11:56 crc kubenswrapper[4914]: I0130 22:11:56.593752 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-wg7x6_d505f587-6893-495d-99a0-acbdee4442df/manager/0.log" Jan 30 22:11:56 crc kubenswrapper[4914]: I0130 22:11:56.920410 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-lxv78_71273dcf-3d97-4a71-9b09-d8261da90f73/manager/0.log" Jan 30 22:11:57 crc kubenswrapper[4914]: I0130 22:11:57.003654 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-td9qq_54b1be48-0cbc-4e1a-be5a-bc6b4cf5df27/manager/0.log" Jan 30 22:11:57 crc kubenswrapper[4914]: I0130 22:11:57.134276 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4d5hgfj_1d5fa522-5cf7-420f-be41-1d55eb8f1b2c/manager/0.log" Jan 30 22:11:57 crc kubenswrapper[4914]: I0130 22:11:57.341621 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-55fdcd6c79-z5tf8_d17e19a0-3c34-44d5-8684-f78694dcb2ce/operator/0.log" Jan 30 22:11:57 crc kubenswrapper[4914]: I0130 22:11:57.625428 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-fqj8p_a7dd7e37-8bb3-44c5-982e-021408582fc6/registry-server/0.log" Jan 30 22:11:57 crc kubenswrapper[4914]: I0130 22:11:57.807477 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-t8sfg_d9414032-0155-4d4b-b456-02ee0f3f4185/manager/0.log" Jan 30 22:11:58 crc kubenswrapper[4914]: I0130 22:11:58.260824 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-wknxv_f23972f8-fa47-444a-b26a-f02086d4f186/manager/0.log" Jan 30 22:11:58 crc kubenswrapper[4914]: I0130 22:11:58.482095 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5l8kc_31234f15-4801-4840-95e7-e985b1d80aa5/operator/0.log" Jan 30 22:11:58 crc kubenswrapper[4914]: I0130 22:11:58.570678 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-gltkz_ccefb276-816b-481e-a645-2bc8b4619d7c/manager/0.log" Jan 30 22:11:58 crc kubenswrapper[4914]: I0130 22:11:58.743392 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d48698d88-mch9s_f2909dee-0316-4626-b532-ebdd66466638/manager/0.log" Jan 30 22:11:58 crc kubenswrapper[4914]: I0130 22:11:58.812850 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-zdp5q_eddacba3-772b-4f09-acfe-60f6c56ba39c/manager/0.log" Jan 30 22:11:59 crc kubenswrapper[4914]: I0130 22:11:59.052423 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-th2ws_8a629cdc-714c-442c-90ae-d20b15d257c6/manager/0.log" Jan 30 22:11:59 crc kubenswrapper[4914]: I0130 22:11:59.187018 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6749767b8f-rdc4j_b8e1305a-5b6a-45f3-a228-b16259431de5/manager/0.log" Jan 30 22:12:22 crc kubenswrapper[4914]: I0130 22:12:22.368118 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-p49xt_e3882a6f-456e-4016-b6b1-76a916735c3b/control-plane-machine-set-operator/0.log" Jan 30 22:12:22 crc kubenswrapper[4914]: I0130 22:12:22.616397 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-85rbp_6b3718ea-66f6-4f01-97c5-94c7c844e1a0/kube-rbac-proxy/0.log" Jan 30 22:12:22 crc kubenswrapper[4914]: I0130 22:12:22.622599 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-85rbp_6b3718ea-66f6-4f01-97c5-94c7c844e1a0/machine-api-operator/0.log" Jan 30 22:12:38 crc kubenswrapper[4914]: I0130 22:12:38.109371 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x54cc_5bc05e58-c6ae-4998-9ea7-1f60ec131e48/cert-manager-controller/0.log" Jan 30 22:12:38 crc kubenswrapper[4914]: I0130 22:12:38.439738 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-s2t9k_a22759e4-6040-4a43-affd-a278ced19421/cert-manager-cainjector/0.log" Jan 30 22:12:38 crc kubenswrapper[4914]: I0130 22:12:38.583386 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-fzp8x_814783bd-aa98-42ce-9cbe-8afdaa508449/cert-manager-webhook/0.log" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.240828 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d4cmk"] Jan 30 22:12:53 crc kubenswrapper[4914]: E0130 22:12:53.241878 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" 
containerName="extract-utilities" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.241894 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="extract-utilities" Jan 30 22:12:53 crc kubenswrapper[4914]: E0130 22:12:53.241910 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7acfda-c7d9-4c01-a780-acbb6f88c418" containerName="container-00" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.241918 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7acfda-c7d9-4c01-a780-acbb6f88c418" containerName="container-00" Jan 30 22:12:53 crc kubenswrapper[4914]: E0130 22:12:53.241965 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="registry-server" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.241971 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="registry-server" Jan 30 22:12:53 crc kubenswrapper[4914]: E0130 22:12:53.241983 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="extract-content" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.241989 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="extract-content" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.242202 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="398008a3-a324-4e81-9509-9bfe168aaf48" containerName="registry-server" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.242222 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7acfda-c7d9-4c01-a780-acbb6f88c418" containerName="container-00" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.244192 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.270033 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4cmk"] Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.408429 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-utilities\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.408573 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2nr\" (UniqueName: \"kubernetes.io/projected/562f7775-6d1d-4768-b1f8-38bbaff02906-kube-api-access-zm2nr\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.408595 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-catalog-content\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.510280 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2nr\" (UniqueName: \"kubernetes.io/projected/562f7775-6d1d-4768-b1f8-38bbaff02906-kube-api-access-zm2nr\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.510343 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-catalog-content\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.510445 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-utilities\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.511083 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-utilities\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.511205 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-catalog-content\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.529864 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2nr\" (UniqueName: \"kubernetes.io/projected/562f7775-6d1d-4768-b1f8-38bbaff02906-kube-api-access-zm2nr\") pod \"redhat-operators-d4cmk\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.609566 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:12:53 crc kubenswrapper[4914]: I0130 22:12:53.918071 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-29k7h_ce26ef88-3c09-4fe1-bb28-56fceef865fb/nmstate-console-plugin/0.log" Jan 30 22:12:54 crc kubenswrapper[4914]: I0130 22:12:54.184284 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4cmk"] Jan 30 22:12:54 crc kubenswrapper[4914]: I0130 22:12:54.200427 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-njdw4_a054e0a7-fe48-4adc-b216-d386c6ecd958/nmstate-handler/0.log" Jan 30 22:12:54 crc kubenswrapper[4914]: I0130 22:12:54.223587 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-vrwjg_5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d/kube-rbac-proxy/0.log" Jan 30 22:12:54 crc kubenswrapper[4914]: I0130 22:12:54.481993 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-vrwjg_5e8e35e1-e28f-47ac-b6d2-d51a7da04d2d/nmstate-metrics/0.log" Jan 30 22:12:54 crc kubenswrapper[4914]: I0130 22:12:54.773051 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-9rpjx_48287b34-9e24-45ef-b31a-d8e32f405068/nmstate-operator/0.log" Jan 30 22:12:54 crc kubenswrapper[4914]: I0130 22:12:54.851459 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-lfjgh_167400a6-ae93-41a2-a825-ba7bd5984a12/nmstate-webhook/0.log" Jan 30 22:12:55 crc kubenswrapper[4914]: I0130 22:12:55.144555 4914 generic.go:334] "Generic (PLEG): container finished" podID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerID="c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677" exitCode=0 Jan 30 22:12:55 crc kubenswrapper[4914]: I0130 22:12:55.144666 
4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerDied","Data":"c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677"} Jan 30 22:12:55 crc kubenswrapper[4914]: I0130 22:12:55.144913 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerStarted","Data":"e55ed798f61218d87b0abc16dcf91e01116459e88b3c3b99331d9cf6970d07d4"} Jan 30 22:12:56 crc kubenswrapper[4914]: I0130 22:12:56.155455 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerStarted","Data":"a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821"} Jan 30 22:13:07 crc kubenswrapper[4914]: I0130 22:13:07.258045 4914 generic.go:334] "Generic (PLEG): container finished" podID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerID="a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821" exitCode=0 Jan 30 22:13:07 crc kubenswrapper[4914]: I0130 22:13:07.258601 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerDied","Data":"a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821"} Jan 30 22:13:07 crc kubenswrapper[4914]: I0130 22:13:07.261289 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.080314 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqdgb"] Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.083000 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.103489 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqdgb"] Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.232967 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-utilities\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.233012 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbcv\" (UniqueName: \"kubernetes.io/projected/f43fab7f-d6cf-4905-b4fa-9791d435bd79-kube-api-access-xbbcv\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.233064 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-catalog-content\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.291606 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerStarted","Data":"db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1"} Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.318833 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-d4cmk" podStartSLOduration=2.804604897 podStartE2EDuration="15.318812432s" podCreationTimestamp="2026-01-30 22:12:53 +0000 UTC" firstStartedPulling="2026-01-30 22:12:55.146903377 +0000 UTC m=+3508.585540128" lastFinishedPulling="2026-01-30 22:13:07.661110892 +0000 UTC m=+3521.099747663" observedRunningTime="2026-01-30 22:13:08.313092611 +0000 UTC m=+3521.751729372" watchObservedRunningTime="2026-01-30 22:13:08.318812432 +0000 UTC m=+3521.757449193" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.335717 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-utilities\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.335774 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbcv\" (UniqueName: \"kubernetes.io/projected/f43fab7f-d6cf-4905-b4fa-9791d435bd79-kube-api-access-xbbcv\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.335835 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-catalog-content\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.336287 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-utilities\") pod \"certified-operators-vqdgb\" (UID: 
\"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.336348 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-catalog-content\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.360217 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbcv\" (UniqueName: \"kubernetes.io/projected/f43fab7f-d6cf-4905-b4fa-9791d435bd79-kube-api-access-xbbcv\") pod \"certified-operators-vqdgb\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:08 crc kubenswrapper[4914]: I0130 22:13:08.407108 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.006166 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqdgb"] Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.301733 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerStarted","Data":"8ccfcfe0e43b311e3406d84089149f2ab3c95805dc936fa0babccd65b65e7142"} Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.487745 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ckkb"] Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.490549 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.501649 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ckkb"] Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.566592 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-utilities\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.566774 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2x4k\" (UniqueName: \"kubernetes.io/projected/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-kube-api-access-d2x4k\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.566923 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-catalog-content\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.668662 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2x4k\" (UniqueName: \"kubernetes.io/projected/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-kube-api-access-d2x4k\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.668807 4914 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-catalog-content\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.668954 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-utilities\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.669518 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-utilities\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.670161 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-catalog-content\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.704017 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2x4k\" (UniqueName: \"kubernetes.io/projected/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-kube-api-access-d2x4k\") pod \"redhat-marketplace-7ckkb\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:09 crc kubenswrapper[4914]: I0130 22:13:09.813551 4914 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:10 crc kubenswrapper[4914]: I0130 22:13:10.353408 4914 generic.go:334] "Generic (PLEG): container finished" podID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerID="e2e89d499f2cf3807e559941ff310313ce4b48197428933e7bbd2ef3a7afbc17" exitCode=0 Jan 30 22:13:10 crc kubenswrapper[4914]: I0130 22:13:10.355796 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerDied","Data":"e2e89d499f2cf3807e559941ff310313ce4b48197428933e7bbd2ef3a7afbc17"} Jan 30 22:13:10 crc kubenswrapper[4914]: I0130 22:13:10.501624 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ckkb"] Jan 30 22:13:11 crc kubenswrapper[4914]: I0130 22:13:11.368434 4914 generic.go:334] "Generic (PLEG): container finished" podID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerID="3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587" exitCode=0 Jan 30 22:13:11 crc kubenswrapper[4914]: I0130 22:13:11.368493 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerDied","Data":"3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587"} Jan 30 22:13:11 crc kubenswrapper[4914]: I0130 22:13:11.368741 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerStarted","Data":"47b77bfe34e44cdb385c567cba229a05de639d6efcae032c602e71a8280fddd2"} Jan 30 22:13:12 crc kubenswrapper[4914]: I0130 22:13:12.396184 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" 
event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerStarted","Data":"61cfdfab9fa529b551d9e158f12c2f23c1d64942e381c9555d9bdd6984fdce2c"} Jan 30 22:13:12 crc kubenswrapper[4914]: I0130 22:13:12.621662 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6cc9c48657-sbpc5_863c64b0-0be9-464d-973a-2bbfc89a6ff0/kube-rbac-proxy/0.log" Jan 30 22:13:12 crc kubenswrapper[4914]: I0130 22:13:12.693470 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6cc9c48657-sbpc5_863c64b0-0be9-464d-973a-2bbfc89a6ff0/manager/0.log" Jan 30 22:13:13 crc kubenswrapper[4914]: I0130 22:13:13.411066 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerStarted","Data":"ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946"} Jan 30 22:13:13 crc kubenswrapper[4914]: I0130 22:13:13.610113 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:13:13 crc kubenswrapper[4914]: I0130 22:13:13.610173 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:13:14 crc kubenswrapper[4914]: I0130 22:13:14.667714 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:14 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:14 crc kubenswrapper[4914]: > Jan 30 22:13:18 crc kubenswrapper[4914]: I0130 22:13:18.481442 4914 generic.go:334] "Generic (PLEG): container finished" podID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" 
containerID="ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946" exitCode=0 Jan 30 22:13:18 crc kubenswrapper[4914]: I0130 22:13:18.481531 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerDied","Data":"ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946"} Jan 30 22:13:18 crc kubenswrapper[4914]: I0130 22:13:18.484321 4914 generic.go:334] "Generic (PLEG): container finished" podID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerID="61cfdfab9fa529b551d9e158f12c2f23c1d64942e381c9555d9bdd6984fdce2c" exitCode=0 Jan 30 22:13:18 crc kubenswrapper[4914]: I0130 22:13:18.484347 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerDied","Data":"61cfdfab9fa529b551d9e158f12c2f23c1d64942e381c9555d9bdd6984fdce2c"} Jan 30 22:13:19 crc kubenswrapper[4914]: I0130 22:13:19.519951 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerStarted","Data":"9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f"} Jan 30 22:13:19 crc kubenswrapper[4914]: I0130 22:13:19.523492 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerStarted","Data":"cc87b29c814eabb9481a93e96ce6728d42af1f3fd9b165b538a1e674541f0bc0"} Jan 30 22:13:19 crc kubenswrapper[4914]: I0130 22:13:19.547964 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ckkb" podStartSLOduration=2.740426033 podStartE2EDuration="10.547944583s" podCreationTimestamp="2026-01-30 22:13:09 +0000 UTC" firstStartedPulling="2026-01-30 22:13:11.370449267 
+0000 UTC m=+3524.809086028" lastFinishedPulling="2026-01-30 22:13:19.177967817 +0000 UTC m=+3532.616604578" observedRunningTime="2026-01-30 22:13:19.536082871 +0000 UTC m=+3532.974719632" watchObservedRunningTime="2026-01-30 22:13:19.547944583 +0000 UTC m=+3532.986581344" Jan 30 22:13:19 crc kubenswrapper[4914]: I0130 22:13:19.565063 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqdgb" podStartSLOduration=2.819373649 podStartE2EDuration="11.565034823s" podCreationTimestamp="2026-01-30 22:13:08 +0000 UTC" firstStartedPulling="2026-01-30 22:13:10.366874204 +0000 UTC m=+3523.805510965" lastFinishedPulling="2026-01-30 22:13:19.112535388 +0000 UTC m=+3532.551172139" observedRunningTime="2026-01-30 22:13:19.555385016 +0000 UTC m=+3532.994021777" watchObservedRunningTime="2026-01-30 22:13:19.565034823 +0000 UTC m=+3533.003671604" Jan 30 22:13:19 crc kubenswrapper[4914]: I0130 22:13:19.814559 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:19 crc kubenswrapper[4914]: I0130 22:13:19.814617 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:20 crc kubenswrapper[4914]: I0130 22:13:20.862041 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-7ckkb" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:20 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:20 crc kubenswrapper[4914]: > Jan 30 22:13:24 crc kubenswrapper[4914]: I0130 22:13:24.661844 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:24 
crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:24 crc kubenswrapper[4914]: > Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.407743 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.408356 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.471976 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.558429 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9752c_1633e963-9082-4659-af26-20bc3b1e512b/prometheus-operator/0.log" Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.689569 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.764200 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqdgb"] Jan 30 22:13:28 crc kubenswrapper[4914]: I0130 22:13:28.911138 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae/prometheus-operator-admission-webhook/0.log" Jan 30 22:13:29 crc kubenswrapper[4914]: I0130 22:13:29.141898 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_128f28df-6fdd-4a2c-86af-6dfe33baf2c9/prometheus-operator-admission-webhook/0.log" Jan 30 22:13:29 crc kubenswrapper[4914]: I0130 22:13:29.254964 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gf6nm_0f547924-f70d-41f4-8461-57953f81d9ac/operator/0.log" Jan 30 22:13:29 crc kubenswrapper[4914]: I0130 22:13:29.539922 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-6c2k7_1b8e78e3-d709-4289-b9aa-15a6270a66d0/perses-operator/0.log" Jan 30 22:13:29 crc kubenswrapper[4914]: I0130 22:13:29.861512 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:29 crc kubenswrapper[4914]: I0130 22:13:29.914106 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:30 crc kubenswrapper[4914]: I0130 22:13:30.639595 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqdgb" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="registry-server" containerID="cri-o://cc87b29c814eabb9481a93e96ce6728d42af1f3fd9b165b538a1e674541f0bc0" gracePeriod=2 Jan 30 22:13:31 crc kubenswrapper[4914]: I0130 22:13:31.113581 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ckkb"] Jan 30 22:13:31 crc kubenswrapper[4914]: I0130 22:13:31.653487 4914 generic.go:334] "Generic (PLEG): container finished" podID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerID="cc87b29c814eabb9481a93e96ce6728d42af1f3fd9b165b538a1e674541f0bc0" exitCode=0 Jan 30 22:13:31 crc kubenswrapper[4914]: I0130 22:13:31.653556 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerDied","Data":"cc87b29c814eabb9481a93e96ce6728d42af1f3fd9b165b538a1e674541f0bc0"} Jan 30 22:13:31 crc kubenswrapper[4914]: I0130 22:13:31.655116 4914 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-7ckkb" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="registry-server" containerID="cri-o://9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f" gracePeriod=2 Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.407861 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.502902 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-catalog-content\") pod \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.503120 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-utilities\") pod \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.503298 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbbcv\" (UniqueName: \"kubernetes.io/projected/f43fab7f-d6cf-4905-b4fa-9791d435bd79-kube-api-access-xbbcv\") pod \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\" (UID: \"f43fab7f-d6cf-4905-b4fa-9791d435bd79\") " Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.504734 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-utilities" (OuterVolumeSpecName: "utilities") pod "f43fab7f-d6cf-4905-b4fa-9791d435bd79" (UID: "f43fab7f-d6cf-4905-b4fa-9791d435bd79"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.536264 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43fab7f-d6cf-4905-b4fa-9791d435bd79-kube-api-access-xbbcv" (OuterVolumeSpecName: "kube-api-access-xbbcv") pod "f43fab7f-d6cf-4905-b4fa-9791d435bd79" (UID: "f43fab7f-d6cf-4905-b4fa-9791d435bd79"). InnerVolumeSpecName "kube-api-access-xbbcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.568216 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f43fab7f-d6cf-4905-b4fa-9791d435bd79" (UID: "f43fab7f-d6cf-4905-b4fa-9791d435bd79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.606384 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbbcv\" (UniqueName: \"kubernetes.io/projected/f43fab7f-d6cf-4905-b4fa-9791d435bd79-kube-api-access-xbbcv\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.606433 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.606472 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43fab7f-d6cf-4905-b4fa-9791d435bd79-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.676014 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.679848 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqdgb" event={"ID":"f43fab7f-d6cf-4905-b4fa-9791d435bd79","Type":"ContainerDied","Data":"8ccfcfe0e43b311e3406d84089149f2ab3c95805dc936fa0babccd65b65e7142"} Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.679897 4914 scope.go:117] "RemoveContainer" containerID="cc87b29c814eabb9481a93e96ce6728d42af1f3fd9b165b538a1e674541f0bc0" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.680020 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqdgb" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.692004 4914 generic.go:334] "Generic (PLEG): container finished" podID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerID="9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f" exitCode=0 Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.692074 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerDied","Data":"9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f"} Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.692099 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ckkb" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.692118 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ckkb" event={"ID":"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8","Type":"ContainerDied","Data":"47b77bfe34e44cdb385c567cba229a05de639d6efcae032c602e71a8280fddd2"} Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.776547 4914 scope.go:117] "RemoveContainer" containerID="61cfdfab9fa529b551d9e158f12c2f23c1d64942e381c9555d9bdd6984fdce2c" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.779010 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-utilities\") pod \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.779164 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-catalog-content\") pod \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.779509 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2x4k\" (UniqueName: \"kubernetes.io/projected/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-kube-api-access-d2x4k\") pod \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\" (UID: \"0f3aae7a-21c8-4488-ae7c-3e73c065e8a8\") " Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.780522 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-utilities" (OuterVolumeSpecName: "utilities") pod "0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" (UID: "0f3aae7a-21c8-4488-ae7c-3e73c065e8a8"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.789449 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-kube-api-access-d2x4k" (OuterVolumeSpecName: "kube-api-access-d2x4k") pod "0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" (UID: "0f3aae7a-21c8-4488-ae7c-3e73c065e8a8"). InnerVolumeSpecName "kube-api-access-d2x4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.798314 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqdgb"] Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.804373 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" (UID: "0f3aae7a-21c8-4488-ae7c-3e73c065e8a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.816072 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqdgb"] Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.830520 4914 scope.go:117] "RemoveContainer" containerID="e2e89d499f2cf3807e559941ff310313ce4b48197428933e7bbd2ef3a7afbc17" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.873155 4914 scope.go:117] "RemoveContainer" containerID="9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.883338 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2x4k\" (UniqueName: \"kubernetes.io/projected/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-kube-api-access-d2x4k\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.883390 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.883405 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.901826 4914 scope.go:117] "RemoveContainer" containerID="ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.934279 4914 scope.go:117] "RemoveContainer" containerID="3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.952538 4914 scope.go:117] "RemoveContainer" containerID="9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f" Jan 30 22:13:32 crc kubenswrapper[4914]: E0130 22:13:32.953089 
4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f\": container with ID starting with 9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f not found: ID does not exist" containerID="9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.953143 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f"} err="failed to get container status \"9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f\": rpc error: code = NotFound desc = could not find container \"9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f\": container with ID starting with 9cb0304495f0a4c282f5003a8a372e66fc7872120f903ee529495b8fcb326f1f not found: ID does not exist" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.953179 4914 scope.go:117] "RemoveContainer" containerID="ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946" Jan 30 22:13:32 crc kubenswrapper[4914]: E0130 22:13:32.953666 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946\": container with ID starting with ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946 not found: ID does not exist" containerID="ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.953699 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946"} err="failed to get container status \"ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946\": rpc error: code = 
NotFound desc = could not find container \"ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946\": container with ID starting with ba8e5cc6e0eb50c281c0c6c5d3d2548b61dab0202250d2b77bdee3faa66d8946 not found: ID does not exist" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.953746 4914 scope.go:117] "RemoveContainer" containerID="3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587" Jan 30 22:13:32 crc kubenswrapper[4914]: E0130 22:13:32.954067 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587\": container with ID starting with 3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587 not found: ID does not exist" containerID="3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587" Jan 30 22:13:32 crc kubenswrapper[4914]: I0130 22:13:32.954117 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587"} err="failed to get container status \"3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587\": rpc error: code = NotFound desc = could not find container \"3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587\": container with ID starting with 3dcdedd7a252f28e184edc7e2c94a23c384d032d7351cd57b145a7d038b6c587 not found: ID does not exist" Jan 30 22:13:33 crc kubenswrapper[4914]: I0130 22:13:33.031911 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ckkb"] Jan 30 22:13:33 crc kubenswrapper[4914]: I0130 22:13:33.047410 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ckkb"] Jan 30 22:13:34 crc kubenswrapper[4914]: I0130 22:13:34.222422 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" 
path="/var/lib/kubelet/pods/0f3aae7a-21c8-4488-ae7c-3e73c065e8a8/volumes" Jan 30 22:13:34 crc kubenswrapper[4914]: I0130 22:13:34.226121 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" path="/var/lib/kubelet/pods/f43fab7f-d6cf-4905-b4fa-9791d435bd79/volumes" Jan 30 22:13:35 crc kubenswrapper[4914]: I0130 22:13:35.268630 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:35 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:35 crc kubenswrapper[4914]: > Jan 30 22:13:44 crc kubenswrapper[4914]: I0130 22:13:44.668127 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:44 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:44 crc kubenswrapper[4914]: > Jan 30 22:13:47 crc kubenswrapper[4914]: I0130 22:13:47.760252 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-8624b_f282ec06-f396-424a-9fba-54a7a74b1831/kube-rbac-proxy/0.log" Jan 30 22:13:47 crc kubenswrapper[4914]: I0130 22:13:47.952421 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-8624b_f282ec06-f396-424a-9fba-54a7a74b1831/controller/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.058855 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-frr-files/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.357318 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-frr-files/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.376564 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-reloader/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.424750 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-reloader/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.427290 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-metrics/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.678375 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-metrics/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.681100 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-metrics/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.684038 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-reloader/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.688826 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-frr-files/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.895030 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-frr-files/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.925197 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-reloader/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.954972 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/cp-metrics/0.log" Jan 30 22:13:48 crc kubenswrapper[4914]: I0130 22:13:48.995723 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/controller/0.log" Jan 30 22:13:49 crc kubenswrapper[4914]: I0130 22:13:49.227154 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/frr-metrics/0.log" Jan 30 22:13:49 crc kubenswrapper[4914]: I0130 22:13:49.328624 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/kube-rbac-proxy/0.log" Jan 30 22:13:49 crc kubenswrapper[4914]: I0130 22:13:49.381074 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/kube-rbac-proxy-frr/0.log" Jan 30 22:13:49 crc kubenswrapper[4914]: I0130 22:13:49.596658 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/reloader/0.log" Jan 30 22:13:49 crc kubenswrapper[4914]: I0130 22:13:49.768380 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-8tvn9_18da6d54-2a6a-4109-9927-194cc41ef5f5/frr-k8s-webhook-server/0.log" Jan 30 22:13:49 crc kubenswrapper[4914]: I0130 22:13:49.954389 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-674bc87d47-gkh5m_a34acb77-7da2-4edf-b829-d8b8ce25657e/manager/0.log" Jan 30 22:13:50 crc kubenswrapper[4914]: I0130 22:13:50.173155 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-55c6c4f998-z99nn_900c747e-7f1b-43d2-ae93-ea5e9891dcf1/webhook-server/0.log" Jan 30 22:13:50 crc kubenswrapper[4914]: I0130 22:13:50.480085 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dwwzf_165fe06d-18fe-40a3-b24d-6093aea89a4e/kube-rbac-proxy/0.log" Jan 30 22:13:50 crc kubenswrapper[4914]: I0130 22:13:50.652664 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-srzwt_aaa76eb6-e780-499d-8fb2-8eeb68cdbae8/frr/0.log" Jan 30 22:13:50 crc kubenswrapper[4914]: I0130 22:13:50.951549 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dwwzf_165fe06d-18fe-40a3-b24d-6093aea89a4e/speaker/0.log" Jan 30 22:13:54 crc kubenswrapper[4914]: I0130 22:13:54.667043 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:54 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:54 crc kubenswrapper[4914]: > Jan 30 22:13:56 crc kubenswrapper[4914]: I0130 22:13:56.983323 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:13:56 crc kubenswrapper[4914]: I0130 22:13:56.984847 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:14:04 crc kubenswrapper[4914]: I0130 22:14:04.666031 
4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:14:04 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:14:04 crc kubenswrapper[4914]: > Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.123002 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/util/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.378648 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/pull/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.411393 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/pull/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.438350 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/util/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.637597 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/util/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.649282 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/pull/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.699278 4914 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dctzrlm_c6ca6c68-3ced-4c24-9caa-7b541c7dc3a5/extract/0.log" Jan 30 22:14:05 crc kubenswrapper[4914]: I0130 22:14:05.880500 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/util/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.073238 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/util/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.095019 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/pull/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.134966 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/pull/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.389533 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/util/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.390583 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/pull/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.406298 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773b94wk_e36ddd94-dd2d-41d1-b22b-805928956f0d/extract/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.559272 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/util/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.815402 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/pull/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.820288 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/pull/0.log" Jan 30 22:14:06 crc kubenswrapper[4914]: I0130 22:14:06.881419 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/util/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.006103 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/util/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.056604 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/pull/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.112649 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135fc2g_aab7512f-d12c-4b8c-b07b-45e27163ac4d/extract/0.log" Jan 
30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.274208 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/util/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.515743 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/pull/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.551405 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/util/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.565840 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/pull/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.740669 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/pull/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.761024 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/util/0.log" Jan 30 22:14:07 crc kubenswrapper[4914]: I0130 22:14:07.854334 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08h4pkb_7133226d-656c-40ba-9d8b-5c0a011efb4b/extract/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.009164 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/extract-utilities/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.161699 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/extract-utilities/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.176549 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/extract-content/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.215092 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/extract-content/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.448928 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/extract-content/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.551206 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/extract-utilities/0.log" Jan 30 22:14:08 crc kubenswrapper[4914]: I0130 22:14:08.892431 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/extract-utilities/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.019347 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4hvtw_5cd8e563-19a1-460d-8a83-1b0d22d6212d/registry-server/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.154577 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/extract-content/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.161549 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/extract-content/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.187374 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/extract-utilities/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.385291 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/extract-utilities/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.477102 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/extract-content/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.631460 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hzn7g_5d539af9-a76b-48db-b786-ab58c8e1d2cf/marketplace-operator/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.744481 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/extract-utilities/0.log" Jan 30 22:14:09 crc kubenswrapper[4914]: I0130 22:14:09.995168 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/extract-content/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.038325 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/extract-content/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.038451 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/extract-utilities/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.148628 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-28dfs_5a28cf6d-31c5-4884-a167-2725c6700e42/registry-server/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.281736 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/extract-utilities/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.292957 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/extract-content/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.388683 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/extract-utilities/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.464559 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mspfz_f146a773-aaa7-4818-bb38-6547863767d5/registry-server/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.605475 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/extract-utilities/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.662247 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/extract-content/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.679218 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/extract-content/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.862530 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/extract-utilities/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.889136 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/extract-content/0.log" Jan 30 22:14:10 crc kubenswrapper[4914]: I0130 22:14:10.917612 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/extract-utilities/0.log" Jan 30 22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.260264 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/extract-content/0.log" Jan 30 22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.320740 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/extract-content/0.log" Jan 30 22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.371206 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/extract-utilities/0.log" Jan 30 22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.463537 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2zb4n_870932da-afd7-4695-ab66-3726c700fea4/registry-server/0.log" Jan 30 
22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.562997 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/extract-content/0.log" Jan 30 22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.567899 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/extract-utilities/0.log" Jan 30 22:14:11 crc kubenswrapper[4914]: I0130 22:14:11.575916 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d4cmk_562f7775-6d1d-4768-b1f8-38bbaff02906/registry-server/0.log" Jan 30 22:14:14 crc kubenswrapper[4914]: I0130 22:14:14.655444 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:14:14 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:14:14 crc kubenswrapper[4914]: > Jan 30 22:14:24 crc kubenswrapper[4914]: I0130 22:14:24.664201 4914 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" probeResult="failure" output=< Jan 30 22:14:24 crc kubenswrapper[4914]: timeout: failed to connect service ":50051" within 1s Jan 30 22:14:24 crc kubenswrapper[4914]: > Jan 30 22:14:25 crc kubenswrapper[4914]: I0130 22:14:25.063819 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b5c6685f6-c28gb_0c6a7dac-bcc3-4acc-a5c0-aa26c17e28ae/prometheus-operator-admission-webhook/0.log" Jan 30 22:14:25 crc kubenswrapper[4914]: I0130 22:14:25.085089 4914 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9752c_1633e963-9082-4659-af26-20bc3b1e512b/prometheus-operator/0.log" Jan 30 22:14:25 crc kubenswrapper[4914]: I0130 22:14:25.322683 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b5c6685f6-lmfbz_128f28df-6fdd-4a2c-86af-6dfe33baf2c9/prometheus-operator-admission-webhook/0.log" Jan 30 22:14:25 crc kubenswrapper[4914]: I0130 22:14:25.366846 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-6c2k7_1b8e78e3-d709-4289-b9aa-15a6270a66d0/perses-operator/0.log" Jan 30 22:14:25 crc kubenswrapper[4914]: I0130 22:14:25.370838 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-gf6nm_0f547924-f70d-41f4-8461-57953f81d9ac/operator/0.log" Jan 30 22:14:26 crc kubenswrapper[4914]: I0130 22:14:26.983396 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:14:26 crc kubenswrapper[4914]: I0130 22:14:26.983459 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:14:33 crc kubenswrapper[4914]: I0130 22:14:33.668175 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:14:33 crc kubenswrapper[4914]: I0130 22:14:33.755211 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:14:33 crc kubenswrapper[4914]: I0130 22:14:33.915078 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4cmk"] Jan 30 22:14:34 crc kubenswrapper[4914]: I0130 22:14:34.828363 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d4cmk" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server" containerID="cri-o://db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1" gracePeriod=2 Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.741270 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.840101 4914 generic.go:334] "Generic (PLEG): container finished" podID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerID="db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1" exitCode=0 Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.840162 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerDied","Data":"db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1"} Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.840197 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4cmk" event={"ID":"562f7775-6d1d-4768-b1f8-38bbaff02906","Type":"ContainerDied","Data":"e55ed798f61218d87b0abc16dcf91e01116459e88b3c3b99331d9cf6970d07d4"} Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.840219 4914 scope.go:117] "RemoveContainer" containerID="db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.840394 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4cmk" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.847229 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-utilities\") pod \"562f7775-6d1d-4768-b1f8-38bbaff02906\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.847318 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-catalog-content\") pod \"562f7775-6d1d-4768-b1f8-38bbaff02906\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.847573 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2nr\" (UniqueName: \"kubernetes.io/projected/562f7775-6d1d-4768-b1f8-38bbaff02906-kube-api-access-zm2nr\") pod \"562f7775-6d1d-4768-b1f8-38bbaff02906\" (UID: \"562f7775-6d1d-4768-b1f8-38bbaff02906\") " Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.848269 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-utilities" (OuterVolumeSpecName: "utilities") pod "562f7775-6d1d-4768-b1f8-38bbaff02906" (UID: "562f7775-6d1d-4768-b1f8-38bbaff02906"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.864835 4914 scope.go:117] "RemoveContainer" containerID="a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.869936 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562f7775-6d1d-4768-b1f8-38bbaff02906-kube-api-access-zm2nr" (OuterVolumeSpecName: "kube-api-access-zm2nr") pod "562f7775-6d1d-4768-b1f8-38bbaff02906" (UID: "562f7775-6d1d-4768-b1f8-38bbaff02906"). InnerVolumeSpecName "kube-api-access-zm2nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.928340 4914 scope.go:117] "RemoveContainer" containerID="c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.950342 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2nr\" (UniqueName: \"kubernetes.io/projected/562f7775-6d1d-4768-b1f8-38bbaff02906-kube-api-access-zm2nr\") on node \"crc\" DevicePath \"\"" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.950406 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.981097 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "562f7775-6d1d-4768-b1f8-38bbaff02906" (UID: "562f7775-6d1d-4768-b1f8-38bbaff02906"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.989081 4914 scope.go:117] "RemoveContainer" containerID="db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1" Jan 30 22:14:35 crc kubenswrapper[4914]: E0130 22:14:35.989696 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1\": container with ID starting with db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1 not found: ID does not exist" containerID="db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.989763 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1"} err="failed to get container status \"db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1\": rpc error: code = NotFound desc = could not find container \"db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1\": container with ID starting with db5d8e770b3f0a4fb7f4c0372d09c0b95d1e33f16fb592a615ff4385f8a851e1 not found: ID does not exist" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.989796 4914 scope.go:117] "RemoveContainer" containerID="a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821" Jan 30 22:14:35 crc kubenswrapper[4914]: E0130 22:14:35.990237 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821\": container with ID starting with a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821 not found: ID does not exist" containerID="a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.990309 
4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821"} err="failed to get container status \"a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821\": rpc error: code = NotFound desc = could not find container \"a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821\": container with ID starting with a58650dc11051c122629ae9f8bcdb72a1eca7bd64a1fe6beed8129c6c33f5821 not found: ID does not exist" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.990367 4914 scope.go:117] "RemoveContainer" containerID="c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677" Jan 30 22:14:35 crc kubenswrapper[4914]: E0130 22:14:35.990845 4914 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677\": container with ID starting with c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677 not found: ID does not exist" containerID="c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677" Jan 30 22:14:35 crc kubenswrapper[4914]: I0130 22:14:35.990891 4914 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677"} err="failed to get container status \"c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677\": rpc error: code = NotFound desc = could not find container \"c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677\": container with ID starting with c1631801878ce23639de3a89a1b1c6ef768bc7f87f7ae7d3424a128e497da677 not found: ID does not exist" Jan 30 22:14:36 crc kubenswrapper[4914]: I0130 22:14:36.052885 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/562f7775-6d1d-4768-b1f8-38bbaff02906-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:14:36 crc kubenswrapper[4914]: I0130 22:14:36.178644 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4cmk"] Jan 30 22:14:36 crc kubenswrapper[4914]: I0130 22:14:36.190480 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d4cmk"] Jan 30 22:14:37 crc kubenswrapper[4914]: I0130 22:14:37.831462 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" path="/var/lib/kubelet/pods/562f7775-6d1d-4768-b1f8-38bbaff02906/volumes" Jan 30 22:14:39 crc kubenswrapper[4914]: I0130 22:14:39.161083 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6cc9c48657-sbpc5_863c64b0-0be9-464d-973a-2bbfc89a6ff0/kube-rbac-proxy/0.log" Jan 30 22:14:39 crc kubenswrapper[4914]: I0130 22:14:39.298071 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6cc9c48657-sbpc5_863c64b0-0be9-464d-973a-2bbfc89a6ff0/manager/0.log" Jan 30 22:14:56 crc kubenswrapper[4914]: I0130 22:14:56.983974 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:14:56 crc kubenswrapper[4914]: I0130 22:14:56.985847 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:14:56 crc 
kubenswrapper[4914]: I0130 22:14:56.985982 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" Jan 30 22:14:56 crc kubenswrapper[4914]: I0130 22:14:56.986988 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25c099c982ea12314f9b5510223aeafb2aa3a30f2bbddfee53e26959ce1db558"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 22:14:56 crc kubenswrapper[4914]: I0130 22:14:56.987143 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://25c099c982ea12314f9b5510223aeafb2aa3a30f2bbddfee53e26959ce1db558" gracePeriod=600 Jan 30 22:14:58 crc kubenswrapper[4914]: I0130 22:14:58.108107 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="25c099c982ea12314f9b5510223aeafb2aa3a30f2bbddfee53e26959ce1db558" exitCode=0 Jan 30 22:14:58 crc kubenswrapper[4914]: I0130 22:14:58.108282 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"25c099c982ea12314f9b5510223aeafb2aa3a30f2bbddfee53e26959ce1db558"} Jan 30 22:14:58 crc kubenswrapper[4914]: I0130 22:14:58.108641 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerStarted","Data":"b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"} Jan 30 22:14:58 crc kubenswrapper[4914]: I0130 
22:14:58.108670 4914 scope.go:117] "RemoveContainer" containerID="22e5b573dfbffbdaffc7408be7c531cc76c63d5bd6befca47e141409bd54e650"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.200286 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"]
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.202657 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.202802 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.202892 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.202958 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203026 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="extract-content"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.203110 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="extract-content"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203188 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="extract-content"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.203255 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="extract-content"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203351 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="extract-utilities"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.203444 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="extract-utilities"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203543 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="extract-utilities"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.203609 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="extract-utilities"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203684 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="extract-content"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.203767 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="extract-content"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203840 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="extract-utilities"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.203910 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="extract-utilities"
Jan 30 22:15:00 crc kubenswrapper[4914]: E0130 22:15:00.203986 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.204055 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.204437 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3aae7a-21c8-4488-ae7c-3e73c065e8a8" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.204530 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="562f7775-6d1d-4768-b1f8-38bbaff02906" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.204603 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43fab7f-d6cf-4905-b4fa-9791d435bd79" containerName="registry-server"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.205740 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.208899 4914 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.220305 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"]
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.226424 4914 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.273370 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-secret-volume\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.273546 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-config-volume\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.273586 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcps7\" (UniqueName: \"kubernetes.io/projected/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-kube-api-access-dcps7\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.376584 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-secret-volume\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.376973 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-config-volume\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.377008 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcps7\" (UniqueName: \"kubernetes.io/projected/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-kube-api-access-dcps7\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.378085 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-config-volume\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.387823 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-secret-volume\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.409964 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcps7\" (UniqueName: \"kubernetes.io/projected/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-kube-api-access-dcps7\") pod \"collect-profiles-29496855-nf7rk\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:00 crc kubenswrapper[4914]: I0130 22:15:00.536872 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:01 crc kubenswrapper[4914]: I0130 22:15:01.103556 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"]
Jan 30 22:15:01 crc kubenswrapper[4914]: I0130 22:15:01.186760 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk" event={"ID":"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1","Type":"ContainerStarted","Data":"56f3fbc87b73a48cd8dfcb30625483273b95f4f2540ee508d6de97c091e2d476"}
Jan 30 22:15:02 crc kubenswrapper[4914]: I0130 22:15:02.199069 4914 generic.go:334] "Generic (PLEG): container finished" podID="452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" containerID="d09dab763fa54a27909c308735ee4db7a4673d7d3892eabced2b4e2aea54fc44" exitCode=0
Jan 30 22:15:02 crc kubenswrapper[4914]: I0130 22:15:02.199114 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk" event={"ID":"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1","Type":"ContainerDied","Data":"d09dab763fa54a27909c308735ee4db7a4673d7d3892eabced2b4e2aea54fc44"}
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.873044 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.967511 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcps7\" (UniqueName: \"kubernetes.io/projected/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-kube-api-access-dcps7\") pod \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") "
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.967719 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-secret-volume\") pod \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") "
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.967763 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-config-volume\") pod \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\" (UID: \"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1\") "
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.968611 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" (UID: "452cdc5c-9bda-42fd-a2a8-21a1f15e83b1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.976890 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-kube-api-access-dcps7" (OuterVolumeSpecName: "kube-api-access-dcps7") pod "452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" (UID: "452cdc5c-9bda-42fd-a2a8-21a1f15e83b1"). InnerVolumeSpecName "kube-api-access-dcps7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:15:03 crc kubenswrapper[4914]: I0130 22:15:03.995487 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" (UID: "452cdc5c-9bda-42fd-a2a8-21a1f15e83b1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.072364 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcps7\" (UniqueName: \"kubernetes.io/projected/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-kube-api-access-dcps7\") on node \"crc\" DevicePath \"\""
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.072430 4914 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.072444 4914 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/452cdc5c-9bda-42fd-a2a8-21a1f15e83b1-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.219685 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk" event={"ID":"452cdc5c-9bda-42fd-a2a8-21a1f15e83b1","Type":"ContainerDied","Data":"56f3fbc87b73a48cd8dfcb30625483273b95f4f2540ee508d6de97c091e2d476"}
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.219740 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-nf7rk"
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.219744 4914 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f3fbc87b73a48cd8dfcb30625483273b95f4f2540ee508d6de97c091e2d476"
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.977642 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz"]
Jan 30 22:15:04 crc kubenswrapper[4914]: I0130 22:15:04.991127 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-22trz"]
Jan 30 22:15:05 crc kubenswrapper[4914]: I0130 22:15:05.844773 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11441868-74c0-4ddc-af04-146296bfe8ed" path="/var/lib/kubelet/pods/11441868-74c0-4ddc-af04-146296bfe8ed/volumes"
Jan 30 22:15:44 crc kubenswrapper[4914]: I0130 22:15:44.998912 4914 scope.go:117] "RemoveContainer" containerID="2ec1467c72479f629d162d7d1829fb762555b9a3db9b0116a9d95f436e880a34"
Jan 30 22:15:45 crc kubenswrapper[4914]: I0130 22:15:45.022500 4914 scope.go:117] "RemoveContainer" containerID="cef95861187d59ad6c1dfbf677e944d468dd9847157e32103631e77ed82f581e"
Jan 30 22:16:45 crc kubenswrapper[4914]: I0130 22:16:45.241462 4914 scope.go:117] "RemoveContainer" containerID="2642efdcafc7bbfd193144bdf702365e42ba636ae59305303e4a071216110202"
Jan 30 22:16:46 crc kubenswrapper[4914]: I0130 22:16:46.235361 4914 generic.go:334] "Generic (PLEG): container finished" podID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerID="ec2d60d06fb69d2da722174e44a6fab6b505a322c4c38903414d014e32743ff3" exitCode=0
Jan 30 22:16:46 crc kubenswrapper[4914]: I0130 22:16:46.235638 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" event={"ID":"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0","Type":"ContainerDied","Data":"ec2d60d06fb69d2da722174e44a6fab6b505a322c4c38903414d014e32743ff3"}
Jan 30 22:16:46 crc kubenswrapper[4914]: I0130 22:16:46.236344 4914 scope.go:117] "RemoveContainer" containerID="ec2d60d06fb69d2da722174e44a6fab6b505a322c4c38903414d014e32743ff3"
Jan 30 22:16:46 crc kubenswrapper[4914]: I0130 22:16:46.317006 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gk9h_must-gather-4w8kz_3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0/gather/0.log"
Jan 30 22:16:55 crc kubenswrapper[4914]: I0130 22:16:55.232175 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5gk9h/must-gather-4w8kz"]
Jan 30 22:16:55 crc kubenswrapper[4914]: I0130 22:16:55.233083 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5gk9h/must-gather-4w8kz" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="copy" containerID="cri-o://94313841409ce012de550bd827b78b36f34a05de1b22865084411f73e9a5e51e" gracePeriod=2
Jan 30 22:16:55 crc kubenswrapper[4914]: I0130 22:16:55.241246 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5gk9h/must-gather-4w8kz"]
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.345642 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gk9h_must-gather-4w8kz_3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0/copy/0.log"
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.346244 4914 generic.go:334] "Generic (PLEG): container finished" podID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerID="94313841409ce012de550bd827b78b36f34a05de1b22865084411f73e9a5e51e" exitCode=143
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.628056 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gk9h_must-gather-4w8kz_3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0/copy/0.log"
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.628734 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/must-gather-4w8kz"
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.709022 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-must-gather-output\") pod \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") "
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.709226 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbmmt\" (UniqueName: \"kubernetes.io/projected/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-kube-api-access-xbmmt\") pod \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\" (UID: \"3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0\") "
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.724121 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-kube-api-access-xbmmt" (OuterVolumeSpecName: "kube-api-access-xbmmt") pod "3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" (UID: "3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0"). InnerVolumeSpecName "kube-api-access-xbmmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.811355 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbmmt\" (UniqueName: \"kubernetes.io/projected/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-kube-api-access-xbmmt\") on node \"crc\" DevicePath \"\""
Jan 30 22:16:56 crc kubenswrapper[4914]: I0130 22:16:56.923608 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" (UID: "3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:16:57 crc kubenswrapper[4914]: I0130 22:16:57.016165 4914 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 30 22:16:57 crc kubenswrapper[4914]: I0130 22:16:57.357198 4914 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gk9h_must-gather-4w8kz_3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0/copy/0.log"
Jan 30 22:16:57 crc kubenswrapper[4914]: I0130 22:16:57.358473 4914 scope.go:117] "RemoveContainer" containerID="94313841409ce012de550bd827b78b36f34a05de1b22865084411f73e9a5e51e"
Jan 30 22:16:57 crc kubenswrapper[4914]: I0130 22:16:57.358524 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gk9h/must-gather-4w8kz"
Jan 30 22:16:57 crc kubenswrapper[4914]: I0130 22:16:57.381081 4914 scope.go:117] "RemoveContainer" containerID="ec2d60d06fb69d2da722174e44a6fab6b505a322c4c38903414d014e32743ff3"
Jan 30 22:16:57 crc kubenswrapper[4914]: I0130 22:16:57.828637 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" path="/var/lib/kubelet/pods/3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0/volumes"
Jan 30 22:17:26 crc kubenswrapper[4914]: I0130 22:17:26.983376 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:17:26 crc kubenswrapper[4914]: I0130 22:17:26.983913 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:17:56 crc kubenswrapper[4914]: I0130 22:17:56.983357 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:17:56 crc kubenswrapper[4914]: I0130 22:17:56.983929 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:18:26 crc kubenswrapper[4914]: I0130 22:18:26.983098 4914 patch_prober.go:28] interesting pod/machine-config-daemon-pm2tg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 22:18:26 crc kubenswrapper[4914]: I0130 22:18:26.983639 4914 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 22:18:26 crc kubenswrapper[4914]: I0130 22:18:26.983687 4914 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg"
Jan 30 22:18:26 crc kubenswrapper[4914]: I0130 22:18:26.984698 4914 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"} pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 30 22:18:26 crc kubenswrapper[4914]: I0130 22:18:26.984793 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerName="machine-config-daemon" containerID="cri-o://b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" gracePeriod=600
Jan 30 22:18:27 crc kubenswrapper[4914]: E0130 22:18:27.127764 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:18:27 crc kubenswrapper[4914]: I0130 22:18:27.261820 4914 generic.go:334] "Generic (PLEG): container finished" podID="3be0c366-7d83-42e6-9a85-3f77ce72281f" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" exitCode=0
Jan 30 22:18:27 crc kubenswrapper[4914]: I0130 22:18:27.261889 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" event={"ID":"3be0c366-7d83-42e6-9a85-3f77ce72281f","Type":"ContainerDied","Data":"b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"}
Jan 30 22:18:27 crc kubenswrapper[4914]: I0130 22:18:27.261936 4914 scope.go:117] "RemoveContainer" containerID="25c099c982ea12314f9b5510223aeafb2aa3a30f2bbddfee53e26959ce1db558"
Jan 30 22:18:27 crc kubenswrapper[4914]: I0130 22:18:27.262902 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:18:27 crc kubenswrapper[4914]: E0130 22:18:27.263240 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:18:39 crc kubenswrapper[4914]: I0130 22:18:39.818604 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:18:39 crc kubenswrapper[4914]: E0130 22:18:39.819364 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:18:51 crc kubenswrapper[4914]: I0130 22:18:51.819375 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:18:51 crc kubenswrapper[4914]: E0130 22:18:51.820212 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:19:04 crc kubenswrapper[4914]: I0130 22:19:04.818477 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:19:04 crc kubenswrapper[4914]: E0130 22:19:04.819352 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:19:16 crc kubenswrapper[4914]: I0130 22:19:16.817975 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:19:16 crc kubenswrapper[4914]: E0130 22:19:16.818759 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:19:31 crc kubenswrapper[4914]: I0130 22:19:31.818355 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:19:31 crc kubenswrapper[4914]: E0130 22:19:31.819281 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:19:44 crc kubenswrapper[4914]: I0130 22:19:44.819253 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:19:44 crc kubenswrapper[4914]: E0130 22:19:44.820299 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:19:55 crc kubenswrapper[4914]: I0130 22:19:55.819149 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:19:55 crc kubenswrapper[4914]: E0130 22:19:55.820182 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:20:10 crc kubenswrapper[4914]: I0130 22:20:10.818659 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:20:10 crc kubenswrapper[4914]: E0130 22:20:10.819594 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:20:22 crc kubenswrapper[4914]: I0130 22:20:22.818156 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0"
Jan 30 22:20:22 crc kubenswrapper[4914]: E0130 22:20:22.819164 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.519116 4914 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8gnvl"]
Jan 30 22:20:29 crc kubenswrapper[4914]: E0130 22:20:29.520656 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="copy"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.520678 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="copy"
Jan 30 22:20:29 crc kubenswrapper[4914]: E0130 22:20:29.520697 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" containerName="collect-profiles"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.520718 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" containerName="collect-profiles"
Jan 30 22:20:29 crc kubenswrapper[4914]: E0130 22:20:29.520749 4914 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="gather"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.520756 4914 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="gather"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.521033 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="copy"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.521055 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="452cdc5c-9bda-42fd-a2a8-21a1f15e83b1" containerName="collect-profiles"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.521079 4914 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dca3bbb-e0c3-4ef4-b2e1-08e1102c79e0" containerName="gather"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.523181 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.536774 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gnvl"]
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.620107 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5c9\" (UniqueName: \"kubernetes.io/projected/94b9ace6-42a9-4565-b61c-4f3fd07476a0-kube-api-access-br5c9\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.620271 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-utilities\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.620307 4914 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-catalog-content\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.722730 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5c9\" (UniqueName: \"kubernetes.io/projected/94b9ace6-42a9-4565-b61c-4f3fd07476a0-kube-api-access-br5c9\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.722936 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-utilities\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.722973 4914 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-catalog-content\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.723664 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-catalog-content\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl"
Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.724002 4914 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-utilities\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.750212 4914 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5c9\" (UniqueName: \"kubernetes.io/projected/94b9ace6-42a9-4565-b61c-4f3fd07476a0-kube-api-access-br5c9\") pod \"community-operators-8gnvl\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:29 crc kubenswrapper[4914]: I0130 22:20:29.857295 4914 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:30 crc kubenswrapper[4914]: I0130 22:20:30.495503 4914 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gnvl"] Jan 30 22:20:31 crc kubenswrapper[4914]: I0130 22:20:31.503681 4914 generic.go:334] "Generic (PLEG): container finished" podID="94b9ace6-42a9-4565-b61c-4f3fd07476a0" containerID="d452ef7f0d79ad675046d9bdbb58b05d7f2d71c1109cfef56eefd173f30c9d83" exitCode=0 Jan 30 22:20:31 crc kubenswrapper[4914]: I0130 22:20:31.503761 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerDied","Data":"d452ef7f0d79ad675046d9bdbb58b05d7f2d71c1109cfef56eefd173f30c9d83"} Jan 30 22:20:31 crc kubenswrapper[4914]: I0130 22:20:31.504059 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerStarted","Data":"499c873901263d6ba0f386b00611ceaf0808e7d8d3273e4ac61fcaf8d7a643d6"} Jan 30 22:20:31 crc kubenswrapper[4914]: I0130 
22:20:31.505694 4914 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:20:33 crc kubenswrapper[4914]: I0130 22:20:33.523576 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerStarted","Data":"9928d4e748fa23be5f658e03fd43e5412eda65cdf8898443d58a7b6e8ba72dd8"} Jan 30 22:20:34 crc kubenswrapper[4914]: I0130 22:20:34.540246 4914 generic.go:334] "Generic (PLEG): container finished" podID="94b9ace6-42a9-4565-b61c-4f3fd07476a0" containerID="9928d4e748fa23be5f658e03fd43e5412eda65cdf8898443d58a7b6e8ba72dd8" exitCode=0 Jan 30 22:20:34 crc kubenswrapper[4914]: I0130 22:20:34.540456 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerDied","Data":"9928d4e748fa23be5f658e03fd43e5412eda65cdf8898443d58a7b6e8ba72dd8"} Jan 30 22:20:35 crc kubenswrapper[4914]: I0130 22:20:35.554719 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerStarted","Data":"e18cb5a6a89bbfa1a4d4d3e402d4cb3812dd4d1bc910f6a33eee83905cc8f297"} Jan 30 22:20:35 crc kubenswrapper[4914]: I0130 22:20:35.585125 4914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8gnvl" podStartSLOduration=3.184924854 podStartE2EDuration="6.585095306s" podCreationTimestamp="2026-01-30 22:20:29 +0000 UTC" firstStartedPulling="2026-01-30 22:20:31.505393698 +0000 UTC m=+3964.944030469" lastFinishedPulling="2026-01-30 22:20:34.90556416 +0000 UTC m=+3968.344200921" observedRunningTime="2026-01-30 22:20:35.573718922 +0000 UTC m=+3969.012355683" watchObservedRunningTime="2026-01-30 22:20:35.585095306 +0000 UTC m=+3969.023732067" Jan 30 22:20:36 crc 
kubenswrapper[4914]: I0130 22:20:36.818550 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:20:36 crc kubenswrapper[4914]: E0130 22:20:36.819109 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:20:39 crc kubenswrapper[4914]: I0130 22:20:39.857448 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:39 crc kubenswrapper[4914]: I0130 22:20:39.858063 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:39 crc kubenswrapper[4914]: I0130 22:20:39.919086 4914 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:40 crc kubenswrapper[4914]: I0130 22:20:40.652519 4914 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:40 crc kubenswrapper[4914]: I0130 22:20:40.704943 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gnvl"] Jan 30 22:20:42 crc kubenswrapper[4914]: I0130 22:20:42.616434 4914 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8gnvl" podUID="94b9ace6-42a9-4565-b61c-4f3fd07476a0" containerName="registry-server" containerID="cri-o://e18cb5a6a89bbfa1a4d4d3e402d4cb3812dd4d1bc910f6a33eee83905cc8f297" gracePeriod=2 Jan 30 22:20:43 crc kubenswrapper[4914]: 
I0130 22:20:43.634952 4914 generic.go:334] "Generic (PLEG): container finished" podID="94b9ace6-42a9-4565-b61c-4f3fd07476a0" containerID="e18cb5a6a89bbfa1a4d4d3e402d4cb3812dd4d1bc910f6a33eee83905cc8f297" exitCode=0 Jan 30 22:20:43 crc kubenswrapper[4914]: I0130 22:20:43.635333 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerDied","Data":"e18cb5a6a89bbfa1a4d4d3e402d4cb3812dd4d1bc910f6a33eee83905cc8f297"} Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.079154 4914 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.163462 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-utilities\") pod \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.163511 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br5c9\" (UniqueName: \"kubernetes.io/projected/94b9ace6-42a9-4565-b61c-4f3fd07476a0-kube-api-access-br5c9\") pod \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.163547 4914 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-catalog-content\") pod \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\" (UID: \"94b9ace6-42a9-4565-b61c-4f3fd07476a0\") " Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.164265 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-utilities" (OuterVolumeSpecName: "utilities") pod "94b9ace6-42a9-4565-b61c-4f3fd07476a0" (UID: "94b9ace6-42a9-4565-b61c-4f3fd07476a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.170462 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b9ace6-42a9-4565-b61c-4f3fd07476a0-kube-api-access-br5c9" (OuterVolumeSpecName: "kube-api-access-br5c9") pod "94b9ace6-42a9-4565-b61c-4f3fd07476a0" (UID: "94b9ace6-42a9-4565-b61c-4f3fd07476a0"). InnerVolumeSpecName "kube-api-access-br5c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.266052 4914 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.266082 4914 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br5c9\" (UniqueName: \"kubernetes.io/projected/94b9ace6-42a9-4565-b61c-4f3fd07476a0-kube-api-access-br5c9\") on node \"crc\" DevicePath \"\"" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.652864 4914 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gnvl" event={"ID":"94b9ace6-42a9-4565-b61c-4f3fd07476a0","Type":"ContainerDied","Data":"499c873901263d6ba0f386b00611ceaf0808e7d8d3273e4ac61fcaf8d7a643d6"} Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.653236 4914 scope.go:117] "RemoveContainer" containerID="e18cb5a6a89bbfa1a4d4d3e402d4cb3812dd4d1bc910f6a33eee83905cc8f297" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.652956 4914 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8gnvl" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.674681 4914 scope.go:117] "RemoveContainer" containerID="9928d4e748fa23be5f658e03fd43e5412eda65cdf8898443d58a7b6e8ba72dd8" Jan 30 22:20:44 crc kubenswrapper[4914]: I0130 22:20:44.796797 4914 scope.go:117] "RemoveContainer" containerID="d452ef7f0d79ad675046d9bdbb58b05d7f2d71c1109cfef56eefd173f30c9d83" Jan 30 22:20:45 crc kubenswrapper[4914]: I0130 22:20:45.600150 4914 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94b9ace6-42a9-4565-b61c-4f3fd07476a0" (UID: "94b9ace6-42a9-4565-b61c-4f3fd07476a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:20:45 crc kubenswrapper[4914]: I0130 22:20:45.698791 4914 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9ace6-42a9-4565-b61c-4f3fd07476a0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:20:45 crc kubenswrapper[4914]: I0130 22:20:45.881284 4914 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gnvl"] Jan 30 22:20:45 crc kubenswrapper[4914]: I0130 22:20:45.893752 4914 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8gnvl"] Jan 30 22:20:47 crc kubenswrapper[4914]: I0130 22:20:47.832957 4914 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b9ace6-42a9-4565-b61c-4f3fd07476a0" path="/var/lib/kubelet/pods/94b9ace6-42a9-4565-b61c-4f3fd07476a0/volumes" Jan 30 22:20:49 crc kubenswrapper[4914]: I0130 22:20:49.818284 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:20:49 crc kubenswrapper[4914]: E0130 22:20:49.818848 4914 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:21:00 crc kubenswrapper[4914]: I0130 22:21:00.818026 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:21:00 crc kubenswrapper[4914]: E0130 22:21:00.819643 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:21:11 crc kubenswrapper[4914]: I0130 22:21:11.818882 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:21:11 crc kubenswrapper[4914]: E0130 22:21:11.821191 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:21:22 crc kubenswrapper[4914]: I0130 22:21:22.818845 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:21:22 crc kubenswrapper[4914]: E0130 
22:21:22.819534 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:21:32 crc kubenswrapper[4914]: I0130 22:21:32.444955 4914 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-674bc87d47-gkh5m" podUID="a34acb77-7da2-4edf-b829-d8b8ce25657e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.57:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 22:21:36 crc kubenswrapper[4914]: I0130 22:21:36.818045 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:21:36 crc kubenswrapper[4914]: E0130 22:21:36.818908 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:21:48 crc kubenswrapper[4914]: I0130 22:21:48.818434 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:21:48 crc kubenswrapper[4914]: E0130 22:21:48.819594 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:22:00 crc kubenswrapper[4914]: I0130 22:22:00.818420 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:22:00 crc kubenswrapper[4914]: E0130 22:22:00.819388 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:22:14 crc kubenswrapper[4914]: I0130 22:22:14.818529 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:22:14 crc kubenswrapper[4914]: E0130 22:22:14.819225 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f" Jan 30 22:22:28 crc kubenswrapper[4914]: I0130 22:22:28.817865 4914 scope.go:117] "RemoveContainer" containerID="b3220bc493e0ac7f92cc673f9be61beffe96bca6cb6d08dfb3c35c6f261682e0" Jan 30 22:22:28 crc kubenswrapper[4914]: E0130 22:22:28.818594 4914 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pm2tg_openshift-machine-config-operator(3be0c366-7d83-42e6-9a85-3f77ce72281f)\"" pod="openshift-machine-config-operator/machine-config-daemon-pm2tg" podUID="3be0c366-7d83-42e6-9a85-3f77ce72281f"